Feb 17 13:25:24 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 13:25:24 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 13:25:24 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc 
restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:24 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 13:25:25 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 13:25:26 crc kubenswrapper[4804]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.331832 4804 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341661 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341717 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341728 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341739 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341749 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341758 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341767 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341775 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341783 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341791 4804 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341800 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341810 4804 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341819 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341827 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341835 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341843 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341851 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341859 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341868 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341878 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341888 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341897 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341907 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341922 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341934 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341944 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341954 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341962 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341971 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341979 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341990 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.341999 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342009 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342018 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342030 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342041 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342054 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342064 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342080 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342091 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342100 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342109 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342117 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342126 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342138 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342148 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342159 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342167 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342175 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342185 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342193 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342231 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342240 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342249 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342258 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342272 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342282 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342291 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342299 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342307 4804 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342314 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342322 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342330 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342338 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342349 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342357 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342365 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342373 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342381 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342389 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.342398 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342572 4804 flags.go:64] FLAG: --address="0.0.0.0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342595 4804 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342612 4804 flags.go:64] FLAG: --anonymous-auth="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342625 4804 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342639 4804 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342649 4804 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342663 4804 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342677 4804 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342687 4804 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342696 4804 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342707 4804 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342721 4804 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342731 4804 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342740 4804 flags.go:64] FLAG: --cgroup-root=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342749 4804 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342757 4804 flags.go:64] FLAG: --client-ca-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342767 4804 flags.go:64] FLAG: --cloud-config=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342776 4804 flags.go:64] FLAG: --cloud-provider=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342784 4804 flags.go:64] FLAG: --cluster-dns="[]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342796 4804 flags.go:64] FLAG: --cluster-domain=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342805 4804 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342814 4804 flags.go:64] FLAG: --config-dir=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342823 4804 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342833 4804 flags.go:64] FLAG: --container-log-max-files="5"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342844 4804 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342854 4804 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342862 4804 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342872 4804 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342881 4804 flags.go:64] FLAG: --contention-profiling="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342891 4804 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342899 4804 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342909 4804 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342918 4804 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342930 4804 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342939 4804 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342948 4804 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342957 4804 flags.go:64] FLAG: --enable-load-reader="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342967 4804 flags.go:64] FLAG: --enable-server="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342977 4804 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342988 4804 flags.go:64] FLAG: --event-burst="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.342998 4804 flags.go:64] FLAG: --event-qps="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343007 4804 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343016 4804 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343025 4804 flags.go:64] FLAG: --eviction-hard=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343037 4804 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343045 4804 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343055 4804 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343068 4804 flags.go:64] FLAG: --eviction-soft=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343080 4804 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343092 4804 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343103 4804 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343114 4804 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343125 4804 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343134 4804 flags.go:64] FLAG: --fail-swap-on="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343143 4804 flags.go:64] FLAG: --feature-gates=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343155 4804 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343165 4804 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343174 4804 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343184 4804 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343193 4804 flags.go:64] FLAG: --healthz-port="10248"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343235 4804 flags.go:64] FLAG: --help="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343244 4804 flags.go:64] FLAG: --hostname-override=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343253 4804 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343262 4804 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343272 4804 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343281 4804 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343321 4804 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343331 4804 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343339 4804 flags.go:64] FLAG: --image-service-endpoint=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343349 4804 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343358 4804 flags.go:64] FLAG: --kube-api-burst="100"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343368 4804 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343378 4804 flags.go:64] FLAG: --kube-api-qps="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343387 4804 flags.go:64] FLAG: --kube-reserved=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343396 4804 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343405 4804 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343414 4804 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343424 4804 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343432 4804 flags.go:64] FLAG: --lock-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343441 4804 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343450 4804 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343459 4804 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343475 4804 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343485 4804 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343494 4804 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343505 4804 flags.go:64] FLAG: --logging-format="text"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343514 4804 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343525 4804 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343533 4804 flags.go:64] FLAG: --manifest-url=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343542 4804 flags.go:64] FLAG: --manifest-url-header=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343557 4804 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343565 4804 flags.go:64] FLAG: --max-open-files="1000000"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343576 4804 flags.go:64] FLAG: --max-pods="110"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343585 4804 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343594 4804 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343603 4804 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343612 4804 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343621 4804 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343632 4804 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343642 4804 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343666 4804 flags.go:64] FLAG: --node-status-max-images="50"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343675 4804 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343685 4804 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343694 4804 flags.go:64] FLAG: --pod-cidr=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343703 4804 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343720 4804 flags.go:64] FLAG: --pod-manifest-path=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343729 4804 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343738 4804 flags.go:64] FLAG: --pods-per-core="0"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343749 4804 flags.go:64] FLAG: --port="10250"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343758 4804 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343768 4804 flags.go:64] FLAG: --provider-id=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343777 4804 flags.go:64] FLAG: --qos-reserved=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343787 4804 flags.go:64] FLAG: --read-only-port="10255"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343796 4804 flags.go:64] FLAG: --register-node="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343805 4804 flags.go:64] FLAG: --register-schedulable="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343815 4804 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343831 4804 flags.go:64] FLAG: --registry-burst="10"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343847 4804 flags.go:64] FLAG: --registry-qps="5"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343856 4804 flags.go:64] FLAG: --reserved-cpus=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343868 4804 flags.go:64] FLAG: --reserved-memory=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343881 4804 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343890 4804 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343899 4804 flags.go:64] FLAG: --rotate-certificates="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343908 4804 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343917 4804 flags.go:64] FLAG: --runonce="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343928 4804 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343937 4804 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343946 4804 flags.go:64] FLAG: --seccomp-default="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343956 4804 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343965 4804 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343976 4804 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343987 4804 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.343996 4804 flags.go:64] FLAG: --storage-driver-password="root"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344005 4804 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344014 4804 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344022 4804 flags.go:64] FLAG: --storage-driver-user="root"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344032 4804 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344041 4804 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344050 4804 flags.go:64] FLAG: --system-cgroups=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344059 4804 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344072 4804 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344082 4804 flags.go:64] FLAG: --tls-cert-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344091 4804 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344106 4804 flags.go:64] FLAG: --tls-min-version=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344117 4804 flags.go:64] FLAG: --tls-private-key-file=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344128 4804 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344139 4804 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344151 4804 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344160 4804 flags.go:64] FLAG: --v="2"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344177 4804 flags.go:64] FLAG: --version="false"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344189 4804 flags.go:64] FLAG: --vmodule=""
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344231 4804 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.344243 4804 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344509 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344522 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344532 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344540 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344549 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344557 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344566 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344576 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344584 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344592 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344601 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344610 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344618 4804 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344630 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344641 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344653 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344664 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344673 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344681 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344689 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344697 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344705 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344713 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344720 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344728 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344736 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344744 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344751 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344773 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344781 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344789 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344796 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344804 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344814 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344824 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344834 4804 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344842 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344850 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344859 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344867 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344875 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344883 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344891 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344898 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344906 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344913 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344921 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344929 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344936 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344945 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344952 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344960 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344968 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344976 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344984 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.344992 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345000 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345007 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345015 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345023 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345033 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345041 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345048 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345056 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345063 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345071 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345078 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345086 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345094 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345104 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.345114 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.345145 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.357962 4804 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.358280 4804 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358431 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358444 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358450 4804 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358459 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358465 4804 feature_gate.go:330] unrecognized feature gate: Example Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358470 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358476 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358484 4804 feature_gate.go:330] unrecognized 
feature gate: PlatformOperators Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358491 4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358498 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358504 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358510 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358516 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358522 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358530 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358536 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358545 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358556 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358564 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358571 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358577 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358584 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358590 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358597 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358603 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358609 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358616 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358696 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358707 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358715 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358723 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358731 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358737 4804 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358745 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358754 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358761 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358767 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358773 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358779 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358785 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358792 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358799 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358804 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358810 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358816 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358830 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358836 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 13:25:26 crc 
kubenswrapper[4804]: W0217 13:25:26.358841 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358847 4804 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358853 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358859 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358867 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358883 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358891 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358898 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358905 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358912 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358918 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358925 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358932 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358940 4804 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358946 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358953 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358958 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358964 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358970 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358975 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358980 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358986 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.358992 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359000 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.359012 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 13:25:26 crc 
kubenswrapper[4804]: W0217 13:25:26.359306 4804 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359321 4804 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359329 4804 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359336 4804 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359343 4804 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359350 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359356 4804 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359364 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359371 4804 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359378 4804 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359384 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359391 4804 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359397 4804 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359403 4804 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359409 4804 feature_gate.go:330] unrecognized feature gate: 
CSIDriverSharedResource Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359414 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359420 4804 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359425 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359431 4804 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359437 4804 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359446 4804 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359452 4804 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359457 4804 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359465 4804 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359475 4804 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359481 4804 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359487 4804 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359494 4804 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359499 4804 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359505 4804 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359512 4804 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359518 4804 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359541 4804 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359546 4804 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359553 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359559 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359565 4804 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359571 4804 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359576 
4804 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359582 4804 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359588 4804 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359594 4804 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359600 4804 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359606 4804 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359611 4804 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359619 4804 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359688 4804 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359703 4804 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359709 4804 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359717 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359723 4804 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359731 4804 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359738 4804 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359744 4804 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359750 4804 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359784 4804 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359793 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359800 4804 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359806 4804 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359814 4804 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359821 4804 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359827 4804 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359855 4804 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359861 4804 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359866 4804 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359872 4804 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359878 4804 feature_gate.go:330] unrecognized feature gate: Example Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359884 4804 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359889 4804 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359895 4804 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.359902 4804 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.359911 4804 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.361595 4804 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.367997 4804 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.368150 4804 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.370096 4804 server.go:997] "Starting client certificate rotation" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.370157 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.371125 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-26 11:16:25.096677686 +0000 UTC Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.371278 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.393990 4804 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.396637 4804 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.397502 4804 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.414293 4804 log.go:25] "Validated CRI v1 runtime API" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.449646 4804 log.go:25] "Validated CRI v1 image API" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.451834 4804 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.457954 4804 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-13-20-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.457995 4804 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.477784 4804 manager.go:217] Machine: {Timestamp:2026-02-17 13:25:26.474271897 +0000 UTC m=+0.585691244 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2305fbdc-66f1-473f-924a-04d713bb59e5 
BootID:bf842257-95c9-4f3c-a5d3-b668d3623b7b Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:dc:c8:69 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:dc:c8:69 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:62:e1:f4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:09:a3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:97:72:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d2:39:2c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:70:e4:1f:8d:ca Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:12:a5:16:8f:5b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] 
Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.478089 4804 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.478368 4804 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480526 4804 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480716 4804 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480764 4804 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.480989 4804 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481001 4804 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481587 4804 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.481624 4804 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.482618 4804 state_mem.go:36] "Initialized new in-memory state store" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.483071 4804 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486548 4804 kubelet.go:418] "Attempting to sync node with API server" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486572 4804 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486619 4804 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486641 4804 kubelet.go:324] "Adding apiserver pod source" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.486658 4804 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.491007 4804 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.492851 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.492930 4804 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.492980 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.493118 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.493258 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.495155 4804 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496664 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496688 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496695 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496723 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496735 4804 plugins.go:603] "Loaded volume 
plugin" pluginName="kubernetes.io/nfs" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496743 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496751 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496763 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496772 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496781 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496801 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.496810 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.499295 4804 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.499876 4804 server.go:1280] "Started kubelet" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501091 4804 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501238 4804 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501873 4804 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.501873 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc systemd[1]: Started Kubernetes Kubelet. Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512341 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512404 4804 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.512822 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 18:58:11.345160868 +0000 UTC Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514241 4804 server.go:460] "Adding debug handlers to kubelet server" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.514716 4804 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514266 4804 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514248 4804 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.514931 4804 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.515693 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.515826 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.517349 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.516298 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950b8c7f5626bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,LastTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525364 4804 factory.go:55] Registering systemd factory Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525400 4804 factory.go:221] Registration of the systemd container factory successfully Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525855 4804 factory.go:153] Registering CRI-O factory Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.525890 4804 factory.go:221] Registration of the crio container factory successfully Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526012 4804 factory.go:219] Registration of the containerd 
container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526048 4804 factory.go:103] Registering Raw factory Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526067 4804 manager.go:1196] Started watching for new ooms in manager Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.526776 4804 manager.go:319] Starting recovery of all containers Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529774 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529853 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529872 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529885 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529897 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529911 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529923 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529937 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529982 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.529995 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530007 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530020 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530034 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530052 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530070 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530087 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530104 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530120 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530136 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530151 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530166 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530184 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530227 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530244 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530259 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530273 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530295 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530316 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530335 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 
13:25:26.530350 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530384 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530405 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530422 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530441 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530461 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530479 4804 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530497 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530514 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530532 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530550 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530571 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530618 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530637 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530697 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530720 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530738 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530757 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530775 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530793 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530811 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530830 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530847 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530874 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530894 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530914 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530932 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530954 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530973 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.530991 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531007 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531024 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531044 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531070 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531088 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531106 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531149 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531161 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531177 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531219 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531233 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531246 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531262 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531277 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531291 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531305 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531320 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531333 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531350 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531370 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531389 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531406 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531420 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531435 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531449 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531469 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531487 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.531506 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535074 4804 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535122 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535141 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535156 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535171 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535186 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535307 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535327 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535341 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535355 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535369 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535382 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535397 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535410 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535423 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535437 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535453 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535466 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535486 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535505 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535520 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535548 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535563 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535576 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535591 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535603 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535618 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535633 4804 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535647 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535660 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535675 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535688 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535701 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535716 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535731 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535744 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535760 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535773 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535787 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535799 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535813 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535826 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535837 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535850 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535864 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535875 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535888 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535902 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535914 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535931 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535943 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535954 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535968 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535982 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.535994 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536008 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536021 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536034 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536046 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536059 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536072 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536085 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536098 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536111 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536124 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536138 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536150 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536162 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536176 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536188 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536221 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536235 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536249 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536261 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536275 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536288 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536300 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536313 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536327 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536340 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536353 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536367 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536380 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536394 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536407 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536420 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536431 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536443 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536458 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536471 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536485 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536498 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536511 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536525 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536538 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536551 4804 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536565 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536588 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536601 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536613 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536625 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536638 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536651 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536662 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536677 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536689 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536703 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536718 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536732 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536746 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536761 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536779 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536793 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536809 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536825 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536841 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536855 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536869 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536883 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536896 4804 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536912 4804 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536925 4804 reconstruct.go:97] "Volume reconstruction finished" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.536935 4804 reconciler.go:26] "Reconciler: start to sync state" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.546896 4804 manager.go:324] Recovery completed Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.560653 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563731 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.563837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.565966 4804 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.565994 4804 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.566020 4804 state_mem.go:36] "Initialized new in-memory state store" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.570656 4804 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572632 4804 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572684 4804 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.572725 4804 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.572783 4804 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 13:25:26 crc kubenswrapper[4804]: W0217 13:25:26.574158 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.574248 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.584341 4804 policy_none.go:49] "None policy: Start" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.585539 4804 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.585569 4804 state_mem.go:35] "Initializing new in-memory state store" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.615331 4804 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642598 4804 manager.go:334] "Starting Device Plugin manager" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642667 4804 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.642684 4804 server.go:79] "Starting device plugin registration server" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643332 4804 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643368 4804 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643533 4804 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643656 4804 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.643664 4804 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.656146 4804 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.673341 4804 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.673481 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675347 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675601 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.675805 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677336 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677553 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677592 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.677767 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678506 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.678625 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679369 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679473 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.679511 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680333 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680546 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680636 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680673 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.680826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682818 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.682851 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.683803 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.719243 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.738937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739041 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739080 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739148 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739178 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739528 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739589 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739678 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739760 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739793 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739818 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739863 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.739889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.746488 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748088 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.748221 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.748822 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841753 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.841938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842001 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842079 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842007 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842562 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842639 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842784 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842800 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842991 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.842931 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843275 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843374 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843492 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843535 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843281 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843876 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.843935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.949767 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951532 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951606 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:26 crc kubenswrapper[4804]: I0217 13:25:26.951673 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:26 crc kubenswrapper[4804]: E0217 13:25:26.952261 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.025667 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.039753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.067923 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.083049 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.089393 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.109953 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4 WatchSource:0}: Error finding container 3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4: Status 404 returned error can't find the container with id 3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.121008 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.352759 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.355260 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.356111 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.370971 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.371120 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.433415 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.433538 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.503257 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.513409 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 08:45:28.131066668 +0000 UTC
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.578868 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0a7a934f78e281c8f88227737b7f30d54cb5ca058b47787a991facbf9592952e"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.580027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3394a7f8c096efd845c065e2617b995b290896af079653a41de7aa4bacc5bdf4"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.580934 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ac18128bfad11f4caf6ed0d0b5f6d02428aed2f1d6bebd0a585f011f1e8f3f7"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.581775 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c7635304167e905f1cb3b586b13f91d232901a8c76cf21458c9aa252bd6f3831"}
Feb 17 13:25:27 crc kubenswrapper[4804]: I0217 13:25:27.583076 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e67bc0e1272885d0d52ffb35751a295519092d92cdd212d3e948bda8734caaeb"}
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.921733 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s"
Feb 17 13:25:27 crc kubenswrapper[4804]: W0217 13:25:27.927862 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:27 crc kubenswrapper[4804]: E0217 13:25:27.927997 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: W0217 13:25:28.092033 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.092130 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.156766 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.158505 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.159016 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.504084 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.514529 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 05:20:50.776575272 +0000 UTC
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.525314 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.527298 4804 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.589237 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44" exitCode=0
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.589312 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.589419 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592015 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.592076 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596623 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596641 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596652 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.596669 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599916 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.599992 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602436 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8" exitCode=0
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602548 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.602601 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603585 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.603594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604652 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec7641c3e61e45ce165b538d77e41c41463fe218e5274bb57372944010127cd4" exitCode=0
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604687 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec7641c3e61e45ce165b538d77e41c41463fe218e5274bb57372944010127cd4"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.604745 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.605676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.607733 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608553 4804 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53" exitCode=0
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.608714 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53"}
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609893 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.609940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.846505 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: I0217 13:25:28.870588 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 13:25:28 crc kubenswrapper[4804]: E0217 13:25:28.878233 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950b8c7f5626bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,LastTimestamp:2026-02-17 13:25:26.499837629 +0000 UTC m=+0.611256966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 13:25:29 crc kubenswrapper[4804]: W0217 13:25:29.480145 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.480304 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.503069 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.515441 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 00:57:36.349367205 +0000 UTC
Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.523290 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.623661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.623718 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.626420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.626481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.628905 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="253431bca5d3b9f01e549f7c312eacf3f14ac51ef1c78bea9bb825f13ee2e119" exitCode=0
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.629048 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"253431bca5d3b9f01e549f7c312eacf3f14ac51ef1c78bea9bb825f13ee2e119"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.629166 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.630754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632789 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632817 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c16d90fa3ea6207e30c8c7d82c6d77586b791fbe1a490094e34f01371a61d89b"}
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.632797 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633696 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633734 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.633878 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.759098 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:25:29 crc kubenswrapper[4804]: I0217 13:25:29.760687 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 13:25:29 crc kubenswrapper[4804]: E0217 13:25:29.762030 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Feb 17 13:25:30 crc kubenswrapper[4804]: W0217 13:25:30.117683 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Feb 17 13:25:30 crc 
kubenswrapper[4804]: E0217 13:25:30.117784 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.267985 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.504013 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.516243 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:58:04.544354523 +0000 UTC Feb 17 13:25:30 crc kubenswrapper[4804]: W0217 13:25:30.629295 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:30 crc kubenswrapper[4804]: E0217 13:25:30.629398 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.640254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.640314 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.641285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644152 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644181 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644212 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.644266 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.646488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648056 4804 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="70e01634eee46cf26502b33fa597c0d4e345be38e1f95e56ca07dba16ce6367e" exitCode=0 Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648167 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648224 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648856 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"70e01634eee46cf26502b33fa597c0d4e345be38e1f95e56ca07dba16ce6367e"} Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648928 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.648974 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650306 4804 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650286 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650385 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.650397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651387 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:30 crc kubenswrapper[4804]: I0217 13:25:30.651403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:31 crc kubenswrapper[4804]: W0217 13:25:31.079585 4804 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Feb 17 13:25:31 crc kubenswrapper[4804]: E0217 13:25:31.079711 4804 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" 
logger="UnhandledError" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.516416 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:30:48.596444804 +0000 UTC Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.657401 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658653 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"541cb2bc26adf968b3e261905d4f54392f7e0f3fb688675af189c3feeae5296f"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658741 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2febdf4e794d3ec70cd39dedfc469d822d38420a688d1f0599e3cc416851fdb3"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39b06f8f75ba5208ad068cb319c4f3b420d3ffb99f9fb677a10859dafd34f27a"} Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658817 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658848 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658914 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.658821 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:31 crc 
kubenswrapper[4804]: I0217 13:25:31.659149 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661881 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661972 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662055 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.661944 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:31 crc kubenswrapper[4804]: I0217 13:25:31.662890 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.516631 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 14:44:53.158254255 +0000 UTC Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 
13:25:32.667025 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8dd69ec5e7306ece674a51d346fd95fc0733bdeb9623ac1bddd3d0f8a48cc421"} Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667132 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667132 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"539310a045c9a289318eb5035a3ba0c7f77907ff51dff0f5df5210e67acda4b5"} Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.667138 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668774 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668823 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.668840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.670068 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.823629 4804 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.963061 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965130 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:32 crc kubenswrapper[4804]: I0217 13:25:32.965234 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.269098 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.269298 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.517765 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 
14:32:31.018845354 +0000 UTC Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.639842 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.640118 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.640240 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642164 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.642413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.669444 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670822 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670879 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.670923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.785490 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.785758 4804 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.787647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.950585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.951003 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952972 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:33 crc kubenswrapper[4804]: I0217 13:25:33.952992 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.518739 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 05:44:58.939205993 +0000 UTC Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.581696 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.672864 4804 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674164 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:34 crc kubenswrapper[4804]: I0217 13:25:34.674290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.420876 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.421257 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423329 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.423462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:35 crc kubenswrapper[4804]: I0217 13:25:35.518927 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:12:58.142182787 +0000 UTC Feb 17 13:25:36 crc kubenswrapper[4804]: I0217 13:25:36.519272 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 08:24:34.262556302 +0000 UTC Feb 17 13:25:36 crc kubenswrapper[4804]: E0217 13:25:36.656488 4804 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.519943 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:01:03.629208781 +0000 UTC Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.613485 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.613665 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615169 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.615187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.621019 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.683430 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.685111 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:37 crc kubenswrapper[4804]: I0217 13:25:37.685185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:37 crc 
kubenswrapper[4804]: I0217 13:25:37.685238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:38 crc kubenswrapper[4804]: I0217 13:25:38.520239 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 03:28:56.933397153 +0000 UTC Feb 17 13:25:39 crc kubenswrapper[4804]: I0217 13:25:39.521542 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:01:25.343788772 +0000 UTC Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.522118 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:38:47.531707637 +0000 UTC Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.899880 4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 13:25:40 crc kubenswrapper[4804]: I0217 13:25:40.900023 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.504079 4804 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.522795 4804 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 20:07:22.40900653 +0000 UTC Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.538141 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.538346 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539582 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.539591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.586890 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.904769 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905929 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905976 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.905989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:41 crc kubenswrapper[4804]: I0217 13:25:41.920953 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-etcd/etcd-crc" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.431543 4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.431609 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.438120 4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.438211 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.523688 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:04:19.878062397 +0000 UTC Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.906909 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908229 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:42 crc kubenswrapper[4804]: I0217 13:25:42.908238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.269446 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.269513 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:25:43 crc kubenswrapper[4804]: I0217 13:25:43.524333 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:23:42.939656301 +0000 UTC Feb 17 13:25:44 crc kubenswrapper[4804]: I0217 13:25:44.524912 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:57:19.226521544 +0000 UTC Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.435079 4804 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.435266 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436759 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.436768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.440336 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.525633 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:55:45.442618736 +0000 UTC Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.916355 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.916415 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917559 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:45 crc kubenswrapper[4804]: I0217 13:25:45.917570 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 13:25:46 crc kubenswrapper[4804]: I0217 13:25:46.526544 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:54:17.913843945 +0000 UTC Feb 17 13:25:46 crc kubenswrapper[4804]: E0217 13:25:46.656686 4804 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.421632 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.423467 4804 trace.go:236] Trace[1758872461]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:33.913) (total time: 13510ms): Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1758872461]: ---"Objects listed" error: 13510ms (13:25:47.423) Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1758872461]: [13.510324875s] [13.510324875s] END Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.423496 4804 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.425542 4804 trace.go:236] Trace[1424273427]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:32.834) (total time: 14590ms): Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1424273427]: ---"Objects listed" error: 14590ms (13:25:47.425) Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[1424273427]: [14.590838409s] [14.590838409s] END Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.425570 4804 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.425632 4804 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.428402 4804 trace.go:236] Trace[794050110]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:36.730) (total time: 10698ms): Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[794050110]: ---"Objects listed" error: 10697ms (13:25:47.428) Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[794050110]: [10.698059931s] [10.698059931s] END Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.428442 4804 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.429189 4804 trace.go:236] Trace[2072162229]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 13:25:34.608) (total time: 12821ms): Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[2072162229]: ---"Objects listed" error: 12821ms (13:25:47.429) Feb 17 13:25:47 crc kubenswrapper[4804]: Trace[2072162229]: [12.821148164s] [12.821148164s] END Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.429234 4804 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.429374 4804 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.434307 4804 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.467875 4804 csr.go:261] certificate signing request csr-z8hrm is approved, waiting to be issued Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.471939 4804 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53372->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472045 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53372->192.168.126.11:17697: read: connection reset by peer" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472466 4804 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.472509 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.476937 4804 csr.go:257] certificate signing request csr-z8hrm is issued Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.500985 4804 apiserver.go:52] "Watching apiserver" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.503796 4804 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504174 4804 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504692 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.504735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.505102 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.506172 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506570 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.505060 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506651 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.506922 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.506990 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.510632 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.510869 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511075 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511392 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511671 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511756 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.511908 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.512420 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.514099 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.516477 4804 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526722 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526792 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526828 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526856 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526884 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526909 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526929 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526949 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526975 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526994 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.526953 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:45:48.479668538 +0000 UTC Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527012 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527113 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527169 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527208 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527278 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527331 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527381 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527415 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527454 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527521 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527575 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527799 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527882 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.527970 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528060 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528099 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528167 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528325 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528364 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528415 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528424 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528507 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528539 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528571 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528605 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528631 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528719 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528745 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528771 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528794 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528818 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528843 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528868 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528924 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529008 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529035 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529060 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529085 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529140 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529166 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529201 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529241 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529266 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529320 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529349 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529375 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529400 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529425 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529452 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529528 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529597 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529627 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529652 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529675 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529699 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529724 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529750 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.529772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529817 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529841 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529890 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529913 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529938 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529962 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529987 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530013 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530036 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530089 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530112 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530134 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530184 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530664 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530708 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530727 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530747 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530766 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.530786 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530805 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530824 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530842 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530859 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530878 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530897 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530963 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531017 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531035 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531053 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531072 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531156 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531172 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531189 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531208 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531248 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531268 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531284 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531302 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531335 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531355 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 
13:25:47.531396 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531411 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531465 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531484 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531519 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531555 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531573 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531592 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531670 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531688 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531707 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531723 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531740 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531759 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531795 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531814 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531832 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") 
" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531869 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531890 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531934 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531953 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531972 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531990 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532011 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532028 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532080 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532098 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532118 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532136 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532172 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532188 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532209 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.532305 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532322 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532341 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532360 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532410 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532428 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532463 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532482 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532500 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532518 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532534 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532585 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532644 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532661 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532678 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532695 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.532736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532835 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532853 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532878 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532900 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532948 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533032 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.533108 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533120 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533132 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533143 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533154 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528565 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528680 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.528893 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529112 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529573 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529669 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529919 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529951 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530124 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.529248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530810 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530856 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.530891 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531015 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531033 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531196 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531523 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531885 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.531924 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541558 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532669 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.532916 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533276 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.533538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534245 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.534594 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.536720 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.536925 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.537161 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538590 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538848 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.538929 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539212 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539445 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.539690 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540026 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540255 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540270 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540510 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.540963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541905 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541137 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541052 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.541823 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542250 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542575 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.542902 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.543567 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544062 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544305 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.544398 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545267 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.545896 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546253 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.546264 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.547108 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.547178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548300 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548503 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548589 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.548877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.549539 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.549822 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550325 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550542 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.550934 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551003 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551308 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551334 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.551618 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552157 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552303 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.552628 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.554459 4804 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.554924 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.558757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559401 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.559933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560620 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560651 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.560663 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561034 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561528 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561657 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.561774 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562030 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562110 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562204 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562277 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562368 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562747 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.562935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563477 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563555 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.563925 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563962 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.564019 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.063992362 +0000 UTC m=+22.175411709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564297 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564328 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564342 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564574 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.564911 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.566351 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.066329398 +0000 UTC m=+22.177748735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566377 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566404 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566790 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.566956 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.567463 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.567604 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:25:48.067583329 +0000 UTC m=+22.179002666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.563999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567244 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568048 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568340 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.568535 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569092 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569742 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.569948 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570051 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.567282 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570250 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570317 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570762 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.570781 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571405 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571499 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.571744 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572177 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572378 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.572561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.573470 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.573543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574233 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574706 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.574769 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.575106 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.576585 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.576699 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.576918 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577015 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577092 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.577383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.577508 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:48.077213254 +0000 UTC m=+22.188632801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.577821 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581021 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581056 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581076 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: E0217 13:25:47.581141 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-02-17 13:25:48.081120162 +0000 UTC m=+22.192539499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.585073 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.585684 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.586415 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.590590 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.594486 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596199 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4fbbv"] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596470 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.596692 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.597341 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.597530 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598295 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598432 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.598594 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599004 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599161 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.599740 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600167 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600439 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600611 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600615 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.600974 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.601041 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.602877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603201 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603435 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603584 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603862 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.603311 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.604406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605206 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605267 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605337 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.605705 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.606442 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.606425 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.609608 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.613169 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.614605 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.616572 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.627933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634833 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634937 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634988 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.634999 4804 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635010 4804 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635020 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635028 4804 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635037 4804 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635045 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635054 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635062 4804 reconciler_common.go:293] 
"Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635073 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635082 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635093 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635102 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635110 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635120 4804 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635130 4804 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635139 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635147 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635156 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635164 4804 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635174 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635182 4804 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635193 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635206 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635229 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635238 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635247 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635255 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635264 4804 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635272 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 
17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635280 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635290 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635299 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635307 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635317 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635326 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635334 4804 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635344 4804 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635353 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635362 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635371 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635379 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635388 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635396 4804 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635404 4804 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635412 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635432 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635441 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635449 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635457 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635465 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635474 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635485 4804 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635493 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635501 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635509 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635518 4804 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635526 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635533 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 
13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635542 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635550 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635559 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635568 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635576 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635584 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635592 4804 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635600 4804 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635608 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635616 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635626 4804 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635635 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635644 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635653 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635662 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635671 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635679 4804 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635687 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635695 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635703 4804 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635711 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635719 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635727 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635736 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635745 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635754 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635763 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635771 4804 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635780 4804 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 
13:25:47.635788 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635797 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635805 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635813 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635821 4804 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635829 4804 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635837 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635845 4804 reconciler_common.go:293] "Volume detached for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635854 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635862 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635870 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635877 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635885 4804 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635894 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635902 4804 reconciler_common.go:293] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635909 4804 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635918 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635927 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635935 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635943 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635951 4804 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635959 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635967 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635975 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635982 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635991 4804 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.635999 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636006 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636015 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636025 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636041 4804 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636050 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636058 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636066 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636074 4804 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636082 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636090 4804 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636098 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636106 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636114 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636123 4804 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636132 4804 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636141 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636149 4804 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636157 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636165 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636174 4804 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636182 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636191 4804 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636201 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636210 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636233 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636241 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636249 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636258 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636267 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636275 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636284 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636292 4804 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636301 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636310 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636318 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636327 4804 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636339 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636347 4804 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636356 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636363 4804 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636371 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636379 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636386 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636395 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636402 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636411 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 
13:25:47.636418 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636426 4804 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636433 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636441 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636449 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636457 4804 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636465 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636473 4804 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636481 4804 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636489 4804 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636497 4804 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636505 4804 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636513 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636521 4804 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636529 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc 
kubenswrapper[4804]: I0217 13:25:47.636537 4804 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636545 4804 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636554 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636562 4804 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636571 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636580 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636588 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636600 4804 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636608 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636618 4804 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.636674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.637615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.653935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.665880 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.669010 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.686498 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.688152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.711421 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.732460 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737857 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737888 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 
17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737898 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.737907 4804 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.738060 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-hosts-file\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.746118 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.756282 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.762927 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwzf\" (UniqueName: \"kubernetes.io/projected/9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc-kube-api-access-tdwzf\") pod \"node-resolver-4fbbv\" (UID: \"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\") " pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.765975 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.775656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.785848 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.801502 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.811406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.821928 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 13:25:47 crc kubenswrapper[4804]: W0217 13:25:47.837062 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3 WatchSource:0}: Error finding container 6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3: Status 404 returned error can't find the container with id 6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.839368 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.848965 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 13:25:47 crc kubenswrapper[4804]: W0217 13:25:47.867431 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3 WatchSource:0}: Error finding container 31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3: Status 404 returned error can't find the container with id 31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.924376 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"31449f902d369e7839414ebdb4443b81c5d14622c2f5521174a7595d71bcefe3"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.924856 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"17f5c3907bc12fd9996f7cdea92a1898f7ca89896d2057e34075d2373e742fc4"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.930155 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6c927af091ae30d91a1acecfccd94615057a900e7971c3f966ec4a9dcd0bc3f3"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.931870 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.934047 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" exitCode=255 Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.934114 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533"} Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.948410 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.961819 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.961889 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zb7c5"] Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.963639 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.965362 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.965393 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966406 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966659 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.966944 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.974133 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4fbbv" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.974517 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:47 crc kubenswrapper[4804]: I0217 13:25:47.992702 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.006074 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.018947 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.036029 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.041272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.041367 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.042272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.042350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.047132 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.057753 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.075618 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.084684 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.085776 4804 scope.go:117] "RemoveContainer" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.087065 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.099837 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.114006 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.126279 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.140862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142738 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.142895 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6992e22f-b963-46fc-ac41-4ca9938dda85-rootfs\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.142938 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.142901571 +0000 UTC m=+23.254320908 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143102 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143260 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143194 4804 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143345 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143412 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143390767 +0000 UTC m=+23.254810254 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143348 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143471 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143540 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143529002 +0000 UTC m=+23.254948339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.143475 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143609 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143639 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143639 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143656 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143706 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143692907 +0000 UTC m=+23.255112424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:48 crc kubenswrapper[4804]: E0217 13:25:48.143730 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:49.143722038 +0000 UTC m=+23.255141605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.144674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6992e22f-b963-46fc-ac41-4ca9938dda85-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.149037 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6992e22f-b963-46fc-ac41-4ca9938dda85-proxy-tls\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.164234 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrlj\" (UniqueName: \"kubernetes.io/projected/6992e22f-b963-46fc-ac41-4ca9938dda85-kube-api-access-jvrlj\") pod \"machine-config-daemon-zb7c5\" (UID: \"6992e22f-b963-46fc-ac41-4ca9938dda85\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.283614 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:25:48 crc kubenswrapper[4804]: W0217 13:25:48.304781 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6992e22f_b963_46fc_ac41_4ca9938dda85.slice/crio-c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db WatchSource:0}: Error finding container c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db: Status 404 returned error can't find the container with id c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.350364 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kclvs"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.350746 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.358310 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4q55t"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.359895 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362351 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362592 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362625 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.362914 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.363413 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.364002 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.364496 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.374054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.394338 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.410712 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.434434 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod 
\"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446881 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446926 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.446997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447063 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447101 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447123 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447215 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447240 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447313 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447396 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447420 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447491 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.447536 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.448263 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.462477 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.478351 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 13:20:47 +0000 UTC, rotation deadline is 2026-12-30 17:08:56.707259972 +0000 UTC Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.478433 4804 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7587h43m8.228833529s 
for next certificate rotation Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.480490 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.491409 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.505050 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.517666 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.527455 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:57:55.052907176 +0000 UTC Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.534657 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.544958 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548830 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548877 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548906 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.548971 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549117 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549143 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549221 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549290 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549327 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: 
\"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549353 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549384 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549437 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549465 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 
13:25:48.549490 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-cni-binary-copy\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549696 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-system-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549953 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-k8s-cni-cncf-io\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.549992 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-cnibin\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-kubelet\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550065 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-multus-certs\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550077 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-conf-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550185 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-run-netns\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550235 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-etc-kubernetes\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-system-cni-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: 
\"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550280 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-multus\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550263 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-socket-dir-parent\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550270 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-cni-dir\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550330 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-hostroot\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-cnibin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550201 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-host-var-lib-cni-bin\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550415 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-os-release\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/526d243d-907b-44f6-a601-de8e86515a3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550835 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/42eec48d-c990-43e6-8348-d9f78997ec3b-multus-daemon-config\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.550876 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/42eec48d-c990-43e6-8348-d9f78997ec3b-os-release\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.551017 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/526d243d-907b-44f6-a601-de8e86515a3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.558574 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.567573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnx7\" (UniqueName: \"kubernetes.io/projected/526d243d-907b-44f6-a601-de8e86515a3c-kube-api-access-5tnx7\") pod \"multus-additional-cni-plugins-4q55t\" (UID: \"526d243d-907b-44f6-a601-de8e86515a3c\") " pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.567702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgkw\" (UniqueName: \"kubernetes.io/projected/42eec48d-c990-43e6-8348-d9f78997ec3b-kube-api-access-rvgkw\") pod \"multus-kclvs\" (UID: \"42eec48d-c990-43e6-8348-d9f78997ec3b\") " pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.574847 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.578364 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.578983 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.580371 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.581074 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.582386 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.583377 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.584274 4804 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.585631 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.585794 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.586550 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.587633 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.588399 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.589408 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.590573 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.591259 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.592451 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.593081 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.594973 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.595988 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.596928 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.597760 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.598296 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.599434 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.599865 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.601087 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.601546 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.602636 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.602996 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.603339 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.606230 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.607042 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608027 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608553 4804 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.608662 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.610654 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.611637 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.612048 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.613660 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.614830 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.615417 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.615495 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.616563 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.617394 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.619302 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.619890 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.621039 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.621653 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.622545 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.623101 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.624144 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.624900 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.625834 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.626309 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.627167 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.627790 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.628368 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.629253 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.631659 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.644628 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.655979 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.666840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.679192 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kclvs" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.686795 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q55t" Feb 17 13:25:48 crc kubenswrapper[4804]: W0217 13:25:48.709023 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod526d243d_907b_44f6_a601_de8e86515a3c.slice/crio-8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1 WatchSource:0}: Error finding container 8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1: Status 404 returned error can't find the container with id 8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1 Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.715516 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.716445 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719410 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719588 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719754 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.719966 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.720125 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.720284 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.721751 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.745137 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751618 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751652 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751666 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751690 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751703 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751743 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751758 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751775 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751803 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751816 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751839 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751856 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751885 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 
17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751899 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.751915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.763597 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.780327 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.793651 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.812708 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.827480 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.846559 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852610 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75nhb\" 
(UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852704 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852756 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852800 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852850 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852881 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852899 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852914 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852929 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852957 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: 
\"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852974 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.852994 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.853741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.853794 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854092 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" 
Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854318 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854955 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854969 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855020 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855002 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855049 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855062 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855039 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855080 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855105 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.855145 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.854984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.856056 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.861038 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod 
\"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.863441 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.874244 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"ovnkube-node-v8mv6\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.885991 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.902492 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.916091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.931319 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.939313 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"c1f313cdb10593b38b07f0d1c97da5357eafea9076297ff6c32bb50afc7727db"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.941130 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.945074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.945911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 
13:25:48.963598 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.963665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"6c116c27328e56e79548ef582b32338297cd7d0dc0365e613e80b60106d64f54"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.967178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.967252 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.969402 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.971168 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerStarted","Data":"8782643a2ab2b2033d39939a3f3fee58be5537e2fb1af1f6086b060414e525d1"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.972948 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/node-resolver-4fbbv" event={"ID":"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc","Type":"ContainerStarted","Data":"9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.972983 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4fbbv" event={"ID":"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc","Type":"ContainerStarted","Data":"acdd4ca1bd0f1e15b1086bca190cfff38491e5cb8e7e682e54bb3fb9aa4a2aec"} Feb 17 13:25:48 crc kubenswrapper[4804]: I0217 13:25:48.973475 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:48Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.012856 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.044912 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.054802 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.098406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.131499 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161083 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:49 crc 
kubenswrapper[4804]: E0217 13:25:49.161294 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161263315 +0000 UTC m=+25.272682672 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161344 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161424 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161451 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161452 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161465 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.161493 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161559 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161584 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161618 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161634 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161677 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161621 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161600986 +0000 UTC m=+25.273020323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161713 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161705889 +0000 UTC m=+25.273125226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161728 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.16172162 +0000 UTC m=+25.273140957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.161980 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:51.161970898 +0000 UTC m=+25.273390235 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.176145 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.215907 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.252418 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.292604 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.333148 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.376852 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.437482 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.451175 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.527680 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 19:47:25.287924316 +0000 UTC Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.528084 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.547609 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.572956 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573105 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.573537 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573598 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.573673 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:49 crc kubenswrapper[4804]: E0217 13:25:49.573716 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.592893 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.621460 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.657866 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.699812 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.737984 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.772454 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.815346 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.859853 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.894481 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977245 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" exitCode=0 Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977380 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.977472 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"a80f9c965ade76b1702626786407637ac7c475f156f06af4c297248b43c44248"} Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.978882 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f" exitCode=0 Feb 17 13:25:49 crc kubenswrapper[4804]: I0217 13:25:49.978984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.003076 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:49Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.030702 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.047813 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.063876 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.092986 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.134041 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.172828 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.214353 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.256898 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.273941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.278430 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.293178 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.311392 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.352734 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.392869 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.435374 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.474304 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.514466 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.529092 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 11:20:01.320226111 +0000 UTC Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.560245 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.596319 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.631733 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.646669 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-z522z"] Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.647046 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.680550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.687105 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.704278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.724656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.744428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.777789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.801333 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.834214 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.872475 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879269 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879349 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.879446 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d0b53df-b6de-4c33-a429-560638368e6c-host\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.881455 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7d0b53df-b6de-4c33-a429-560638368e6c-serviceca\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.926650 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4nd\" (UniqueName: \"kubernetes.io/projected/7d0b53df-b6de-4c33-a429-560638368e6c-kube-api-access-8d4nd\") pod \"node-ca-z522z\" (UID: \"7d0b53df-b6de-4c33-a429-560638368e6c\") " pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.933325 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eea
ee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.973939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:50Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987030 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" 
event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987084 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.987114 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.988365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b"} Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.990431 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" 
containerID="471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144" exitCode=0 Feb 17 13:25:50 crc kubenswrapper[4804]: I0217 13:25:50.991009 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.011339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.021633 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-z522z" Feb 17 13:25:51 crc kubenswrapper[4804]: W0217 13:25:51.041519 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0b53df_b6de_4c33_a429_560638368e6c.slice/crio-797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b WatchSource:0}: Error finding container 797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b: Status 404 returned error can't find the container with id 797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.050590 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.095403 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.133703 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.175189 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.182968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183062 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183025834 +0000 UTC m=+29.294445231 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183097 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183117 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183129 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183181 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183164158 +0000 UTC m=+29.294583495 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.183175 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.183242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183300 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183327 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183346 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 
13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183364 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183327 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183319953 +0000 UTC m=+29.294739290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183426 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183419067 +0000 UTC m=+29.294838404 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183840 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.183905 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:25:55.183887582 +0000 UTC m=+29.295306919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.212837 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.255122 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.292465 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.333647 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.376494 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.412044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.456826 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.494124 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.529611 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 09:59:25.176300906 +0000 UTC Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.534067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573694 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573808 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.573853 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.573997 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.574062 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:51 crc kubenswrapper[4804]: E0217 13:25:51.574236 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.578924 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:51Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.996655 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29" exitCode=0 Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.997012 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.998871 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z522z" event={"ID":"7d0b53df-b6de-4c33-a429-560638368e6c","Type":"ContainerStarted","Data":"0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd"} Feb 17 13:25:51 crc kubenswrapper[4804]: I0217 13:25:51.998999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z522z" event={"ID":"7d0b53df-b6de-4c33-a429-560638368e6c","Type":"ContainerStarted","Data":"797d47d823a56e165831c8a1c52730a61f7bfb6c709c1d1ad8b98b41912ede6b"} Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 
13:25:52.026750 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 
13:25:52.045862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.062121 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.077890 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.095377 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.117393 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.130035 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.143050 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.162973 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.189882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.204610 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.217004 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.228250 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.241225 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.254387 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.265887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.279904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.295634 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.334286 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.373659 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.410044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.450949 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.490309 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.530104 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:09:51.513226531 +0000 UTC Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.531891 4804 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.570545 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.609570 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.651338 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:52 crc kubenswrapper[4804]: I0217 13:25:52.693746 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c335896
39722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:52Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.007829 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02" exitCode=0 Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.007896 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02"} Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.014428 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.020704 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.032905 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.046671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.062700 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.074262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.085075 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.095483 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.107968 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.120185 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.132736 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.142915 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.176317 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.214636 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c335896
39722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.253656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.530397 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-01 05:43:58.390528444 +0000 UTC Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573535 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573606 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.573555 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.573826 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.573996 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.574460 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.830102 4804 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.832582 4804 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.842404 4804 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.842615 4804 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843898 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.843940 4804 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.856262 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.860482 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.872699 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877358 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877412 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877426 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.877525 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.896391 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902711 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.902722 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.920304 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.924644 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.936710 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:53Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:53 crc kubenswrapper[4804]: E0217 13:25:53.936818 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938590 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938692 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:53 crc kubenswrapper[4804]: I0217 13:25:53.938748 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:53Z","lastTransitionTime":"2026-02-17T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.021536 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d" exitCode=0 Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.021585 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042580 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042659 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.042711 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.055307 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.067671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.084006 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.097859 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.111581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.123647 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.137918 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.145584 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.153254 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.169840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.193408 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.231092 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.248997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.249018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.249029 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.263715 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.283518 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:54Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.351853 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.454868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.531328 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 09:46:06.568427921 +0000 UTC Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558359 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.558386 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662610 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.662635 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.765494 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.869891 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973929 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.973983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:54 crc kubenswrapper[4804]: I0217 13:25:54.974003 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:54Z","lastTransitionTime":"2026-02-17T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.029856 4804 generic.go:334] "Generic (PLEG): container finished" podID="526d243d-907b-44f6-a601-de8e86515a3c" containerID="2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a" exitCode=0 Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.029912 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerDied","Data":"2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.046253 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.061622 4804 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077631 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 
13:25:55.077702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.077736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.079179 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c49662
20e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.100188 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.119253 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.134131 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.146999 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.161728 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.179904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabo
uts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180639 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.180685 
4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.194601 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.207871 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.223286 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228189 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228230 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228210125 +0000 UTC m=+37.339629462 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.228341 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228374 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228428 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228476 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228449753 +0000 UTC m=+37.339869090 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228491 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228527 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228540 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228545 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228563 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228575 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228505 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228496635 +0000 UTC m=+37.339915972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228632 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.228611999 +0000 UTC m=+37.340031526 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.228667 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.22866025 +0000 UTC m=+37.340079587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.239596 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.259465 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283384 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.283484 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386324 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386411 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.386422 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.489454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.531572 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:23:24.405567501 +0000 UTC Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.573843 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.574053 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574149 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574356 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.574453 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:55 crc kubenswrapper[4804]: E0217 13:25:55.574609 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.593650 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697031 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.697129 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800304 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.800679 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:55 crc kubenswrapper[4804]: I0217 13:25:55.903541 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:55Z","lastTransitionTime":"2026-02-17T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006433 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.006464 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.036310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" event={"ID":"526d243d-907b-44f6-a601-de8e86515a3c","Type":"ContainerStarted","Data":"eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048515 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048944 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.048983 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.057339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.075607 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.081887 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.082614 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.091456 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.106294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109762 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.109775 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.118369 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.133970 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.148463 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.163939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.179839 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd7
17b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.194295 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.211999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212022 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.212795 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.231914 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.254559 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.273165 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.287825 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.301613 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.315111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.319887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z 
is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.340990 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.358941 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.370654 4804 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.372448 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c/status\": read tcp 38.102.83.146:56570->38.102.83.146:6443: use of closed network connection" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.402323 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.416506 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.419613 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.431153 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.450669 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.463811 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.476958 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 
13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.488638 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.499368 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522390 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.522453 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.531761 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:34:49.350013425 +0000 UTC Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.588874 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"
name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.605714 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.618452 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624371 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.624396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.629775 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.640510 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.648537 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.667739 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.682709 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b9
30704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb
9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17
T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.695159 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.708882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.723881 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.727596 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.742018 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.754482 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.766376 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830649 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.830726 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934947 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:56 crc kubenswrapper[4804]: I0217 13:25:56.934987 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:56Z","lastTransitionTime":"2026-02-17T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.037921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.037984 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038001 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038026 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.038043 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.053016 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141677 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.141818 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.245407 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348267 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.348285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450856 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.450892 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.532640 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:06:32.157674694 +0000 UTC Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554066 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.554111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573305 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573362 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.573393 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573570 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:57 crc kubenswrapper[4804]: E0217 13:25:57.573750 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.622053 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.656905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759502 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.759517 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862347 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.862395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.965969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966011 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:57 crc kubenswrapper[4804]: I0217 13:25:57.966050 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:57Z","lastTransitionTime":"2026-02-17T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.068490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171197 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.171266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.273584 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.376301 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.479302 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.533438 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:56:40.96450051 +0000 UTC Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.581698 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685069 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685169 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.685262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788640 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.788682 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.890980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.891010 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.891033 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.993968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994039 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:58 crc kubenswrapper[4804]: I0217 13:25:58.994070 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:58Z","lastTransitionTime":"2026-02-17T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.060346 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.065798 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" exitCode=1 Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.065837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.066588 4804 scope.go:117] "RemoveContainer" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.091566 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.096874 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.107672 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.122403 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.135099 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.147723 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.160965 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.172206 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.184156 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.198044 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200881 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.200923 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.213633 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.226009 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.237351 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.248007 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.263998 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476
bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:25:59Z is after 2025-08-24T17:21:41Z" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303369 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.303452 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.406185 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508528 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.508561 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.534285 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 22:09:56.281713514 +0000 UTC Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573728 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573760 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.573811 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.573905 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.573990 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:25:59 crc kubenswrapper[4804]: E0217 13:25:59.574081 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611570 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.611608 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714128 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714217 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.714227 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816859 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.816883 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:25:59 crc kubenswrapper[4804]: I0217 13:25:59.922284 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:25:59Z","lastTransitionTime":"2026-02-17T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.025444 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.074137 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.077974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.078352 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.100169 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.119280 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129573 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.129685 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.139601 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.209603 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.224773 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T1
3:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232643 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.232712 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.236031 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.247294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.262118 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.278131 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.293491 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.308134 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.319486 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.333791 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335680 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335698 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.335742 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.353343 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:00Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439385 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.439436 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.535035 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:42:12.921915644 +0000 UTC Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.542273 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.645112 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.748327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850135 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850223 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.850235 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952211 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:00 crc kubenswrapper[4804]: I0217 13:26:00.952269 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:00Z","lastTransitionTime":"2026-02-17T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.040406 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh"] Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.041104 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.043632 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.044384 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059286 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059359 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.059440 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.061581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.083315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 
13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.084571 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.086021 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/0.log" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.092916 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" exitCode=1 Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.092999 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.093115 4804 scope.go:117] "RemoveContainer" containerID="9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.094690 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.095110 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.098505 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.115697 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.130751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.143880 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.158863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161294 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.161395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.173295 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.189868 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.208865 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.226721 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227277 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227304 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.227671 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.241242 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.256124 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264930 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.264970 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.276503 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.292491 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.313706 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.329884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.329970 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.330053 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.330121 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.331885 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.332796 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.332918 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be1ee3c4-2152-421a-b39c-c1455968a17c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.338496 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be1ee3c4-2152-421a-b39c-c1455968a17c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.351091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 
2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.353451 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd9x\" (UniqueName: \"kubernetes.io/projected/be1ee3c4-2152-421a-b39c-c1455968a17c-kube-api-access-6vd9x\") pod \"ovnkube-control-plane-749d76644c-ln7fh\" (UID: \"be1ee3c4-2152-421a-b39c-c1455968a17c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.363478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367279 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f
a29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367456 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367481 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.367499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: W0217 13:26:01.378828 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1ee3c4_2152_421a_b39c_c1455968a17c.slice/crio-1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76 WatchSource:0}: Error finding container 1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76: Status 404 returned error can't find the container with id 1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76 Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.383667 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.401780 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.419851 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314
e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.435680 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.449507 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.462896 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470196 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.470230 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.476644 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.489012 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.504054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.523466 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical 
port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nh
b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.535444 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 
UTC, rotation deadline is 2025-12-22 08:34:44.183370812 +0000 UTC Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.535581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572883 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.572971 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.573010 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573108 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573257 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.573371 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.689956 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690016 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690030 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.690074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.793361 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.816311 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"] Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.817077 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: E0217 13:26:01.817174 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.833056 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.846620 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.860034 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.871610 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.884019 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc 
kubenswrapper[4804]: I0217 13:26:01.896149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896214 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.896244 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.899318 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.913668 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.927315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.937902 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.937951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9vb\" (UniqueName: \"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.940769 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.953911 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.973222 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.986769 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:01Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998337 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 
13:26:01.998413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:01 crc kubenswrapper[4804]: I0217 13:26:01.998423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:01Z","lastTransitionTime":"2026-02-17T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.005704 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.022982 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.038759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9vb\" (UniqueName: \"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.038884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.039048 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.039133 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:02.539108997 +0000 UTC m=+36.650528344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.046049 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ee7fdd23a3c447937ba190cf0b0124318748ff1ff40b8d1be3bc541ddb39963\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:25:58Z\\\",\\\"message\\\":\\\"5:58.145600 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.145961 6107 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 
13:25:58.146060 6107 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146122 6107 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.146233 6107 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146467 6107 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 13:25:58.146849 6107 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:25:58.147451 6107 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 13:25:58.147524 6107 factory.go:656] Stopping watch factory\\\\nI0217 13:25:58.147547 6107 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:25:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical 
port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nh
b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.056869 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9vb\" (UniqueName: 
\"kubernetes.io/projected/e77722ba-d383-442c-b6dc-9983cf233257-kube-api-access-pm9vb\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.064713 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.098055 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.100511 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.102268 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.102499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103669 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103729 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.103745 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" event={"ID":"be1ee3c4-2152-421a-b39c-c1455968a17c","Type":"ContainerStarted","Data":"1b29363c9f732760ea6068fcac77f95fb20c0fc0bcb6637a1dede733620e2d76"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.120397 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.142581 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.160196 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.176296 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.191967 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.202975 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203050 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.203089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.213073 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.224540 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.236539 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc 
kubenswrapper[4804]: I0217 13:26:02.247854 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.263127 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.280315 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9
c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.298555 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307191 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307266 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307287 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.307301 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.312583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.328256 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.345392 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.361660 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.378967 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.393104 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409621 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.409660 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.411244 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab
1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.428691 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.454797 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 
base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.472464 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.488640 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc 
kubenswrapper[4804]: I0217 13:26:02.509452 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.513744 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.524922 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.535855 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:01:32.791514979 +0000 UTC Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.542596 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.545470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.545765 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: E0217 13:26:02.545908 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:03.545871215 +0000 UTC m=+37.657290772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.561788 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.575583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.591441 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.608993 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617237 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.617337 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.627165 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.642515 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:02Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.721700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722375 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.722400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.825875 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928795 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:02 crc kubenswrapper[4804]: I0217 13:26:02.928823 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:02Z","lastTransitionTime":"2026-02-17T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.032886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.136566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239732 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239762 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.239783 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.254402 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.254708 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:19.254657596 +0000 UTC m=+53.366076973 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.254982 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255188 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.255446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255533 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:19.255494754 +0000 UTC m=+53.366914131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255611 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.255993 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:19.25596904 +0000 UTC m=+53.367388377 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.255915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.256306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256120 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256638 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256783 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256973 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:19.256948281 +0000 UTC m=+53.368367658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.256431 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257284 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257425 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.257600 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:19.257582272 +0000 UTC m=+53.369001649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.343965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.344142 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447819 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.447887 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.536562 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:38:08.567342933 +0000 UTC Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.551484 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.560483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.560755 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.560858 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:05.560833729 +0000 UTC m=+39.672253076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573817 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573920 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.573927 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574066 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574272 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.574399 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574599 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:03 crc kubenswrapper[4804]: E0217 13:26:03.574776 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654855 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.654920 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759376 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.759498 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.864141 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.958690 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967050 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967121 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.967171 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:03Z","lastTransitionTime":"2026-02-17T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:03 crc kubenswrapper[4804]: I0217 13:26:03.977711 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:03Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.007421 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.024844 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.047231 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070176 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.070776 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.090945 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.107805 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.124788 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc 
kubenswrapper[4804]: I0217 13:26:04.135881 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.148379 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.167518 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9
c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.172755 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.181910 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.182285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.183720 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d
615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.197380 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.203545 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.208562 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.210406 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.225490 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231774 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.231812 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.232086 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.246489 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.247751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.250806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 
13:26:04.250918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.250988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.251066 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.251135 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.269389 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273472 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.273509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.285064 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:04Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:04 crc kubenswrapper[4804]: E0217 13:26:04.285333 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.287504 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390082 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390140 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.390156 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.492705 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.538034 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:37:19.965498347 +0000 UTC Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596085 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596136 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596148 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.596187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699775 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699823 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.699847 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.802631 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.905904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906096 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:04 crc kubenswrapper[4804]: I0217 13:26:04.906125 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:04Z","lastTransitionTime":"2026-02-17T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.009989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.010004 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112764 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.112916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216232 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.216275 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.318919 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.318996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.319053 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421416 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421460 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.421497 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.524263 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.539478 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 11:31:38.817532964 +0000 UTC Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573464 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573469 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.573608 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.573741 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.574028 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.574125 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.584231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.584363 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:05 crc kubenswrapper[4804]: E0217 13:26:05.584411 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:09.584398126 +0000 UTC m=+43.695817463 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626901 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.626958 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730192 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730249 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.730268 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833270 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.833352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:05 crc kubenswrapper[4804]: I0217 13:26:05.937387 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:05Z","lastTransitionTime":"2026-02-17T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041237 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041399 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.041423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.144923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145024 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.145079 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.248701 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352906 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.352927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456637 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.456703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.540092 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:36:39.214837998 +0000 UTC
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.559671 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.595717 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.614697 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.632688 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.662236 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.668364 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.685568 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.700444 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.713977 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.728850 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.742689 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc 
kubenswrapper[4804]: I0217 13:26:06.759330 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.764333 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.774241 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.788920 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.799318 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.811117 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.828713 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd7
17b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.844263 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:06Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867088 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.867099 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:06 crc kubenswrapper[4804]: I0217 13:26:06.969723 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:06Z","lastTransitionTime":"2026-02-17T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.072986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073045 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.073068 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175111 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.175119 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.277686 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380274 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.380342 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.483134 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.540825 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 20:07:53.100965234 +0000 UTC Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573356 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573438 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573522 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573528 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.573400 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573642 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573769 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:07 crc kubenswrapper[4804]: E0217 13:26:07.573830 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.586142 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.688996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.689020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.689038 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.793147 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.896998 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999682 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999728 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:07 crc kubenswrapper[4804]: I0217 13:26:07.999761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:07.999772 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:07Z","lastTransitionTime":"2026-02-17T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102631 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.102657 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205956 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.205991 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.309111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412813 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.412907 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.515980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516035 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.516060 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.541151 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 21:25:41.222811274 +0000 UTC Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.619998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620653 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.620925 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.722998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.723010 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826284 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826310 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.826319 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.928992 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.929009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:08 crc kubenswrapper[4804]: I0217 13:26:08.929022 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:08Z","lastTransitionTime":"2026-02-17T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032527 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.032568 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135212 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135228 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135252 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.135266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238330 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.238355 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341382 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341482 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.341535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.444973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.445008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.445048 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.542395 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:33:28.809565165 +0000 UTC Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547734 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547753 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.547798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573700 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573735 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573773 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.573871 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.573984 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.576494 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.576851 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.577070 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.628834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.629059 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:09 crc kubenswrapper[4804]: E0217 13:26:09.629225 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:17.629175075 +0000 UTC m=+51.740594412 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650917 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.650958 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.754697 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858085 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858107 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:09 crc kubenswrapper[4804]: I0217 13:26:09.858117 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:09Z","lastTransitionTime":"2026-02-17T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074274 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074339 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074373 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.074388 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.191552 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293612 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.293688 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.395962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.395998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.396034 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499096 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.499132 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.543356 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 02:49:37.536911094 +0000 UTC Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601742 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.601755 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704910 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.704918 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.806703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:10 crc kubenswrapper[4804]: I0217 13:26:10.909358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:10Z","lastTransitionTime":"2026-02-17T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012977 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.012989 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.115213 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.217524 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.319999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.320014 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422802 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422824 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.422835 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526017 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.526111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.543603 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:29:57.253933385 +0000 UTC Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.572945 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573058 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573080 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573156 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.573331 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573331 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573427 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:11 crc kubenswrapper[4804]: E0217 13:26:11.573496 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628690 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.628705 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730738 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730818 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.730859 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833547 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.833645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936019 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936065 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:11 crc kubenswrapper[4804]: I0217 13:26:11.936100 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:11Z","lastTransitionTime":"2026-02-17T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.038990 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.039096 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.141163 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243822 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.243905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.346473 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.491285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.543753 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:40:55.22632969 +0000 UTC Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593707 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.593721 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697456 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697525 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.697670 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.800840 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:12 crc kubenswrapper[4804]: I0217 13:26:12.904390 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:12Z","lastTransitionTime":"2026-02-17T13:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.007592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110663 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110684 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.110699 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.214456 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318479 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.318647 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422633 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.422703 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526871 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.526969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.527000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.527028 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.544298 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 22:01:47.9271917 +0000 UTC Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.573909 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.574085 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.574138 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.573952 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574253 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574370 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574517 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:13 crc kubenswrapper[4804]: E0217 13:26:13.574645 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630439 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.630513 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734367 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734522 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.734547 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.837984 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838083 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.838146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.941535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.941911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:13 crc kubenswrapper[4804]: I0217 13:26:13.942444 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:13Z","lastTransitionTime":"2026-02-17T13:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046878 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.046931 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150173 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150249 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.150316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254257 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.254292 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.358629 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.463592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.544983 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 15:54:30.320974709 +0000 UTC Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567447 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.567535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.575374 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659739 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.659764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.692092 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.700902 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.725435 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt, elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731562 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.731627 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.754737 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt, elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760672 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760736 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.760764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.786739 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.792974 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.817904 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:14Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:14 crc kubenswrapper[4804]: E0217 13:26:14.818358 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822729 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822743 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.822785 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925119 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:14 crc kubenswrapper[4804]: I0217 13:26:14.925164 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:14Z","lastTransitionTime":"2026-02-17T13:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.028971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.029132 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132657 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.132708 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.156687 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.160435 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.161533 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.182115 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.201850 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.218956 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.235338 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.237801 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.249876 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.267583 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.295294 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.313827 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.330041 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314
e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337690 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337761 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.337798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.345798 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.361079 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.375815 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.390904 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.408947 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.432344 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443701 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.443738 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.456571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:15Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.545131 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:13:17.75859469 +0000 UTC Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546710 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.546722 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573477 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573512 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.573550 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573593 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573721 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.573824 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:15 crc kubenswrapper[4804]: E0217 13:26:15.574177 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650487 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.650522 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753780 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753866 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.753881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.857962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858087 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.858108 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962800 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962879 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962905 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962944 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:15 crc kubenswrapper[4804]: I0217 13:26:15.962968 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:15Z","lastTransitionTime":"2026-02-17T13:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.066633 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.166657 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.167593 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/1.log" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169424 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.169700 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172392 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" exitCode=1 Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.172593 4804 scope.go:117] "RemoveContainer" containerID="95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.174099 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:16 crc kubenswrapper[4804]: E0217 13:26:16.174448 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.194863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.221010 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.248817 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.274307 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.285931 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.307559 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.330714 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.348508 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.368262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.376962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377057 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377080 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.377109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.377126 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.388615 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.401177 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.414890 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a8119
2bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.427558 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.440641 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.454527 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.467383 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480121 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.480222 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.484685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z 
is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.545544 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:34:58.136003398 +0000 UTC Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.582868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.590468 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.610700 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95379ad376aaf78838db9d642af0e221709e9becf9072ca5395db3e6dd815e26\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:00Z\\\",\\\"message\\\":\\\"ork-node-identity-vrzqb\\\\nI0217 13:25:59.797105 6248 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0217 13:25:59.796953 6248 metrics.go:553] Stopping metrics 
server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 13:25:59.797140 6248 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0217 13:25:59.797005 6248 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797326 6248 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-kclvs\\\\nI0217 13:25:59.797334 6248 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-kclvs in node crc\\\\nI0217 13:25:59.797340 6248 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-kclvs after 0 failed attempt(s)\\\\nI0217 13:25:59.797345 6248 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-kclvs\\\\nF0217 13:25:59.797308 6248 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.632237 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e66782fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.651830 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.669976 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.683931 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.685583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.685654 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.697871 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.712262 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc 
kubenswrapper[4804]: I0217 13:26:16.724220 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.739759 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.755639 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.765652 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.784134 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181a
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.787899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.788226 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.800963 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.815113 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.828646 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:16Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892697 4804 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.892736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.995986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996107 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:16 crc kubenswrapper[4804]: I0217 13:26:16.996121 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:16Z","lastTransitionTime":"2026-02-17T13:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100515 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.100540 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.178918 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.184623 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.184987 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203344 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.203445 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.208016 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.240908 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.257046 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.280885 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.296325 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.306921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.307706 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.311622 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.323725 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.338091 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.355342 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.376478 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.393404 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.409736 4804 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.410957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.411386 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.421817 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.432374 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.443323 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.455842 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:17Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.515383 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.546706 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 12:28:24.206636108 +0000 UTC Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573304 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573450 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573522 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573588 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573699 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.573779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.573916 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.574158 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619483 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.619499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.651034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.651259 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:17 crc kubenswrapper[4804]: E0217 13:26:17.651370 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:33.651345226 +0000 UTC m=+67.762764753 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.722923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723016 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.723147 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.825890 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929210 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:17 crc kubenswrapper[4804]: I0217 13:26:17.929292 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:17Z","lastTransitionTime":"2026-02-17T13:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.031911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.032097 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134662 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.134682 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238267 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238278 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238304 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.238318 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.340998 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.341018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.341032 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.444400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.444983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.445739 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.546960 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:04:42.089267373 +0000 UTC Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549426 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549482 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549520 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.549534 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652581 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.652599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.754934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.755007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.755066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.860693 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.861756 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.878776 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.891471 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.912692 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687
fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.928063 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.944067 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.961994 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.964366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.964550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965155 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.965905 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:18Z","lastTransitionTime":"2026-02-17T13:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.979981 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:18 crc kubenswrapper[4804]: I0217 13:26:18.992621 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:18Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.004226 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.019799 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.040470 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.053660 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069460 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069959 4804 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.069983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.070028 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.072571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.087434 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 
13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.101333 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.115464 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.130538 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:19Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:19 crc 
kubenswrapper[4804]: I0217 13:26:19.173607 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173678 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.173713 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.270162 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.270445 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.270393882 +0000 UTC m=+85.381813279 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271555 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.271772 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271398 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272132 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272338 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.272593 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.272571404 +0000 UTC m=+85.383990781 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271507 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271810 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.271929 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273069 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.273034788 +0000 UTC m=+85.384454165 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273335 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273529 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273511 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:26:51.273479383 +0000 UTC m=+85.384898790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.273891 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:26:51.273865075 +0000 UTC m=+85.385284532 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276489 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.276516 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380312 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380375 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.380442 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484233 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.484267 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.548118 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:20:06.948393627 +0000 UTC Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573716 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573837 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.574011 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574097 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574254 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574536 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.573872 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:19 crc kubenswrapper[4804]: E0217 13:26:19.574669 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587329 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587440 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.587490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690217 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.690285 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.792960 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.793592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.896359 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998755 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:19 crc kubenswrapper[4804]: I0217 13:26:19.998880 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:19Z","lastTransitionTime":"2026-02-17T13:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102291 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.102551 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.206191 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.309826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.310419 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417890 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417966 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.417994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.418013 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522110 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522164 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.522218 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.549495 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:22:13.200446638 +0000 UTC Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.625740 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729496 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.729594 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832268 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.832306 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:20 crc kubenswrapper[4804]: I0217 13:26:20.935187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:20Z","lastTransitionTime":"2026-02-17T13:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038222 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038279 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038312 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.038326 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.140605 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243527 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243563 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.243575 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.346602 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449218 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.449293 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.550059 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:09:56.470438418 +0000 UTC Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552529 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.552613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573774 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573787 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.573868 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574071 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574263 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:21 crc kubenswrapper[4804]: E0217 13:26:21.574448 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655612 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.655625 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759213 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759301 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759316 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.759355 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862412 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.862451 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:21 crc kubenswrapper[4804]: I0217 13:26:21.966469 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:21Z","lastTransitionTime":"2026-02-17T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.070282 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.172976 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173055 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.173070 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276424 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276495 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.276538 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.379590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482417 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.482446 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.550944 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:50:11.10991498 +0000 UTC Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584924 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.584939 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692476 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.692504 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.795353 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898439 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898555 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:22 crc kubenswrapper[4804]: I0217 13:26:22.898652 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:22Z","lastTransitionTime":"2026-02-17T13:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.001948 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.002101 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106052 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106114 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.106170 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.208347 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.312254 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416221 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416243 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.416256 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.520610 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.551933 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:51:05.525021297 +0000 UTC Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573450 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573579 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573625 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.573688 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573718 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573878 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.573973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:23 crc kubenswrapper[4804]: E0217 13:26:23.574074 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623289 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623370 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.623383 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.726858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.726973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727068 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.727098 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832942 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.832982 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.935852 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936126 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:23 crc kubenswrapper[4804]: I0217 13:26:23.936466 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:23Z","lastTransitionTime":"2026-02-17T13:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039775 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039828 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.039879 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143301 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143379 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143407 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.143423 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246846 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.246986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.247009 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.350556 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454943 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.454996 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.552930 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:26:45.856957319 +0000 UTC Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.558613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662598 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.662610 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821746 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.821813 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924961 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.924997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:24 crc kubenswrapper[4804]: I0217 13:26:24.925024 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:24Z","lastTransitionTime":"2026-02-17T13:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.025737 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.053388 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059507 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.059589 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.076059 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.082881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.100245 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.105982 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.106004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.106040 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.123117 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127766 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127843 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.127902 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.143991 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:25Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.144107 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146093 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146126 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.146153 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.248574 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351466 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.351483 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.453971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.454054 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.454127 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.553322 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 10:15:05.156920557 +0000 UTC Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558600 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.558632 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574291 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574306 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574604 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574649 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574302 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.574309 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574753 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:25 crc kubenswrapper[4804]: E0217 13:26:25.574940 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.661433 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765933 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765961 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.765977 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.869351 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:25 crc kubenswrapper[4804]: I0217 13:26:25.972219 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:25Z","lastTransitionTime":"2026-02-17T13:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075795 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075852 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.075876 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.179991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.180262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.283979 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.284136 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387001 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387163 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.387191 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491350 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.491410 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.553747 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:26:51.707562936 +0000 UTC Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.600172 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606598 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606613 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.606649 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.615863 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.632277 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.650685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.665924 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.681364 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.697865 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709103 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc 
kubenswrapper[4804]: I0217 13:26:26.709182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.709210 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.712840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.729816 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.746987 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.773417 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.791413 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.805354 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc 
kubenswrapper[4804]: I0217 13:26:26.811710 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.811904 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.828150 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.844489 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.859344 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.873033 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:26Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915363 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:26 crc kubenswrapper[4804]: I0217 13:26:26.915378 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:26Z","lastTransitionTime":"2026-02-17T13:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018645 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.018927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122666 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122736 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.122777 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225110 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.225187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329496 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.329559 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434135 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434250 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434272 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434307 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.434327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.538986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.539011 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.554383 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 16:27:22.639784046 +0000 UTC Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573534 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573534 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573553 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.573680 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.573876 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574179 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:27 crc kubenswrapper[4804]: E0217 13:26:27.574499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642399 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.642466 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745867 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.745985 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.850965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.851325 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.851504 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.852345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.852373 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956721 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:27 crc kubenswrapper[4804]: I0217 13:26:27.956856 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:27Z","lastTransitionTime":"2026-02-17T13:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060278 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.060352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163652 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.163670 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.266821 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369628 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369701 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.369736 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473773 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.473793 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.554810 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 17:56:39.714316412 +0000 UTC Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.576681 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.577358 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:28 crc kubenswrapper[4804]: E0217 13:26:28.577619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679630 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679703 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.679740 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782685 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782773 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.782784 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884859 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.884910 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988813 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988846 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:28 crc kubenswrapper[4804]: I0217 13:26:28.988868 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:28Z","lastTransitionTime":"2026-02-17T13:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091423 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091435 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.091492 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.194441 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296975 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.296991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.297013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.297026 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399903 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399928 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.399938 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502229 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.502310 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.554924 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:31:24.398254821 +0000 UTC Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573352 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573434 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573444 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.573447 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.573621 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.573875 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.574020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:29 crc kubenswrapper[4804]: E0217 13:26:29.574160 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.604995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.605010 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708292 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.708392 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811427 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.811486 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914758 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:29 crc kubenswrapper[4804]: I0217 13:26:29.914883 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:29Z","lastTransitionTime":"2026-02-17T13:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.017949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018027 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018048 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.018060 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120317 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.120358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.222957 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326468 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.326514 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429627 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429639 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429662 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.429673 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532565 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.532606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.556060 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:45:51.069984866 +0000 UTC Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635134 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635225 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.635262 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738559 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.738608 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842115 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.842128 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948865 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:30 crc kubenswrapper[4804]: I0217 13:26:30.948908 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:30Z","lastTransitionTime":"2026-02-17T13:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.055959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.056026 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.159995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.160102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.160187 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262898 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262937 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.262951 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366395 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.366410 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470163 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470255 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.470314 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.556370 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:22:40.198698764 +0000 UTC Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573484 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573528 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573667 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.573770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573815 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573894 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.573909 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.573990 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.574078 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.574107 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:31 crc kubenswrapper[4804]: E0217 13:26:31.574301 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.676955 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.677043 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779502 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779548 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.779592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882120 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882225 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.882243 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986132 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:31 crc kubenswrapper[4804]: I0217 13:26:31.986324 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:31Z","lastTransitionTime":"2026-02-17T13:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090076 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.090125 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193402 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193492 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.193506 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296012 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.296118 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399821 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.399836 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.503632 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.556585 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:16:08.050937545 +0000 UTC Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607093 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607157 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.607261 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.709981 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.710000 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.812988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.813020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.813039 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915911 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:32 crc kubenswrapper[4804]: I0217 13:26:32.915951 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:32Z","lastTransitionTime":"2026-02-17T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.018978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019026 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.019073 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122580 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122655 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.122668 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227489 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227574 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.227590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330571 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330692 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.330711 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.434707 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538941 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.538984 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.557417 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 16:37:48.081921087 +0000 UTC Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573838 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573896 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.573855 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574105 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.574132 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574297 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574662 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.574770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641310 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641345 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.641385 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743850 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.743999 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:33 crc kubenswrapper[4804]: E0217 13:26:33.744073 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:05.744054928 +0000 UTC m=+99.855474275 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.743859 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846641 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.846656 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949184 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.949987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.950069 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:33 crc kubenswrapper[4804]: I0217 13:26:33.950139 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:33Z","lastTransitionTime":"2026-02-17T13:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053681 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053791 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.053866 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156470 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156558 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.156570 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258725 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.258783 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.361744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.362734 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.467974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468055 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.468066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.558601 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:37:33.509870398 +0000 UTC Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570901 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570916 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570944 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.570959 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.674416 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777459 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.777507 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880523 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880570 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.880619 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984099 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984113 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:34 crc kubenswrapper[4804]: I0217 13:26:34.984151 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:34Z","lastTransitionTime":"2026-02-17T13:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091649 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.091662 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193913 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193970 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.193989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.194000 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296139 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296219 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296233 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.296265 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399136 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399291 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.399391 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464934 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.464973 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.478573 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483940 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.483999 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.484017 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.484027 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.497452 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501356 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501651 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.501833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.502215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.502755 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.517127 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521505 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521637 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.521865 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.536694 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542160 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542192 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.542231 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.559192 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:51:23.943957618 +0000 UTC Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.559899 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",
\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:35Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.560070 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562372 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562434 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562450 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562476 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.562493 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.572999 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.572999 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573176 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.573013 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573274 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573344 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.573029 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:35 crc kubenswrapper[4804]: E0217 13:26:35.573457 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.665969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666079 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666115 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.666140 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769413 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.769456 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872156 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872174 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.872185 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975590 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:35 crc kubenswrapper[4804]: I0217 13:26:35.975606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:35Z","lastTransitionTime":"2026-02-17T13:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078384 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.078396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181485 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.181497 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255412 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255468 4804 generic.go:334] "Generic (PLEG): container finished" podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa" exitCode=1 Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.255511 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.256033 4804 scope.go:117] "RemoveContainer" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.273089 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.285750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286065 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286094 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286177 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.286328 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.301662 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.314692 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.332056 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.348693 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.360656 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.373707 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc 
kubenswrapper[4804]: I0217 13:26:36.390033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390081 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390111 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.390883 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.405260 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.420054 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.439895 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.458308 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.486550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492511 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.492550 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.512162 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.527939 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.540840 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.553496 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.559756 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 20:20:01.229515341 +0000 UTC Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.589461 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.595590 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.611122 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.625838 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.646176 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.659401 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.671663 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.686977 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc 
kubenswrapper[4804]: I0217 13:26:36.697386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697414 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697425 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.697452 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.704953 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.718682 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.732304 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.747334 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.762029 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.776775 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.793594 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799509 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799562 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.799617 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.810826 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.825825 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.841362 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:36Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902184 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902222 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:36 crc kubenswrapper[4804]: I0217 13:26:36.902233 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:36Z","lastTransitionTime":"2026-02-17T13:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005409 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.005453 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108285 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108309 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.108324 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210770 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210854 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.210866 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.260991 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.261071 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.275607 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.288541 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.300161 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314605 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314643 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314656 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.314688 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.349833 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.366075 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.383932 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.397516 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.410158 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.417857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.418335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.418566 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.421153 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.432037 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.444036 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.460473 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.478023 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.491566 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.503802 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.516017 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525541 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc 
kubenswrapper[4804]: I0217 13:26:37.525576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.525592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.528930 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:37Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.560436 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:58:27.406434549 +0000 UTC Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573834 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573914 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573856 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.573987 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574030 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574147 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574436 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:37 crc kubenswrapper[4804]: E0217 13:26:37.574554 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627765 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627774 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.627798 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788729 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788816 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.788884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.789094 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891905 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891947 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891963 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.891989 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.995647 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996014 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996102 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:37 crc kubenswrapper[4804]: I0217 13:26:37.996327 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:37Z","lastTransitionTime":"2026-02-17T13:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099018 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099062 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.099101 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200714 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200757 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.200801 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302803 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302847 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.302886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406193 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406876 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.406916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510955 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.510995 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.511011 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.561224 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 09:51:55.920514478 +0000 UTC Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614328 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.614347 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717623 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717728 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717824 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.717927 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820644 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820658 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.820686 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923336 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923400 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923420 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:38 crc kubenswrapper[4804]: I0217 13:26:38.923435 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:38Z","lastTransitionTime":"2026-02-17T13:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026577 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026596 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.026606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.128953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129004 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129032 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.129045 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231503 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231584 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.231637 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334092 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.334210 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437535 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437591 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437604 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437627 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.437641 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540541 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.540592 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.562942 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:38:24.297527462 +0000 UTC Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573369 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573597 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.573395 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573690 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573760 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:39 crc kubenswrapper[4804]: E0217 13:26:39.573823 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643378 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.643396 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746521 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.746535 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849235 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849287 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.849316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:39 crc kubenswrapper[4804]: I0217 13:26:39.952162 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:39Z","lastTransitionTime":"2026-02-17T13:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054888 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054943 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054953 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054974 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.054984 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158926 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158942 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.158991 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.261971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262464 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.262520 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365453 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.365484 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468895 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.468997 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.563332 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 08:34:47.779070346 +0000 UTC Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571870 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571896 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.571907 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674726 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674793 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674819 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.674833 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777550 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777610 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.777654 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.880925 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.880988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881034 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.881053 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985329 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985369 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:40 crc kubenswrapper[4804]: I0217 13:26:40.985389 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:40Z","lastTransitionTime":"2026-02-17T13:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.087873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088182 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088394 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.088488 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191517 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191585 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191597 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.191636 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295444 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.295499 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398343 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398526 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.398599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.500969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501033 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.501067 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.563972 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 05:44:24.886720453 +0000 UTC Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573343 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573414 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573344 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.573671 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573706 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.573973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:41 crc kubenswrapper[4804]: E0217 13:26:41.574251 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604061 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604122 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604146 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.604464 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.707749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.708994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.709331 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813459 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813510 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.813529 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917061 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917161 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917181 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:41 crc kubenswrapper[4804]: I0217 13:26:41.917253 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:41Z","lastTransitionTime":"2026-02-17T13:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020693 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020792 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.020808 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124543 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124589 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124601 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.124633 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227740 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227856 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.227873 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331101 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.331228 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434702 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434725 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434779 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.434804 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537804 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537817 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537842 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.537859 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.564869 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 18:58:00.538066844 +0000 UTC Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.575325 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643582 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643621 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643630 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.643659 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747005 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747067 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.747119 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.849397 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953779 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953827 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:42 crc kubenswrapper[4804]: I0217 13:26:42.953848 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:42Z","lastTransitionTime":"2026-02-17T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057166 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057215 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.057227 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.159957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160041 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.160089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263600 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263668 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.263737 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.284958 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.287533 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.288072 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.309357 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.321690 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.334415 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.345671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.358437 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.366965 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.367056 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.370822 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc 
kubenswrapper[4804]: I0217 13:26:43.386121 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.405603 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.420474 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.436287 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.449757 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.466151 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470363 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.470374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.484858 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.499733 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.512191 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.525242 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.538720 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.565723 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:03:39.197563437 +0000 UTC Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573122 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573270 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573128 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573539 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573651 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.573709 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:43 crc kubenswrapper[4804]: E0217 13:26:43.573851 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575508 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.575549 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678157 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678178 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.678190 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781449 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.781586 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884814 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884866 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.884911 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988428 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:43 crc kubenswrapper[4804]: I0217 13:26:43.988565 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:43Z","lastTransitionTime":"2026-02-17T13:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092674 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092766 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.092849 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.195652 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.293799 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.294413 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/2.log" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297624 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297695 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297719 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.297741 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298321 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" exitCode=1 Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298436 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.298557 4804 scope.go:117] "RemoveContainer" containerID="e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.299398 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:44 crc kubenswrapper[4804]: E0217 13:26:44.299654 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.323497 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.344429 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.363081 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.386024 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401239 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401254 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.401291 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.410882 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e98735d17caddfc47ab4d64efe9df73b4150cde17af79b2f98e3e7a201f18701\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:15Z\\\",\\\"message\\\":\\\" 6456 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661364 6456 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661403 6456 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.661438 6456 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 13:26:15.671636 6456 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 13:26:15.671766 6456 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 13:26:15.671804 6456 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 13:26:15.671975 6456 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 13:26:15.671988 6456 factory.go:656] Stopping watch factory\\\\nI0217 13:26:15.690620 6456 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 13:26:15.690795 6456 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 13:26:15.690976 6456 ovnkube.go:599] Stopped ovnkube\\\\nI0217 13:26:15.691073 6456 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 13:26:15.691307 6456 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.428142 4804 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-z522z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.443571 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc 
kubenswrapper[4804]: I0217 13:26:44.459671 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.477320 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.494757 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505332 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc 
kubenswrapper[4804]: I0217 13:26:44.505381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.505400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.516450 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.540887 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.556293 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.566401 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:09:38.276596308 +0000 UTC Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.575932 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.592057 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5
51e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.606417 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609241 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.609362 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.619631 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:44Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712712 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.712725 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.815990 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816082 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816129 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.816146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920260 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:44 crc kubenswrapper[4804]: I0217 13:26:44.920320 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:44Z","lastTransitionTime":"2026-02-17T13:26:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024143 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024162 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024189 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.024252 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127303 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127344 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.127358 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230715 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230789 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230818 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.230832 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.303495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.307910 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.308127 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.323964 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334354 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 
13:26:45.334408 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.334428 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.341731 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.364995 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.381148 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.399330 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.415666 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.431339 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437341 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437365 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.437381 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.445156 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc 
kubenswrapper[4804]: I0217 13:26:45.467751 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24
e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 
13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.482549 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.496784 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.512336 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.523773 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.539538 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542268 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.542834 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.543231 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.543389 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.559668 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.568000 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:11:18.903609411 +0000 UTC Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.573580 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.573786 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.574423 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574663 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.574860 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.575102 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.574908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.575454 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.593049 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646491 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646545 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646594 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.646613 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.708957 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.709065 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.729599 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736733 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.736994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.737374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.762312 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769393 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.769492 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.788451 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793791 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.793805 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.805791 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809716 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.809789 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.826279 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:45Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:45 crc kubenswrapper[4804]: E0217 13:26:45.826514 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.828949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829003 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829022 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.829063 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931687 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931752 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931798 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:45 crc kubenswrapper[4804]: I0217 13:26:45.931816 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:45Z","lastTransitionTime":"2026-02-17T13:26:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.035832 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.036871 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140577 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140605 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.140623 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245315 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245353 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.245371 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347918 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.347949 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451247 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451259 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.451294 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554581 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554609 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.554626 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.569295 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:51:04.871056412 +0000 UTC Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.591537 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.605385 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.624021 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.642287 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc 
kubenswrapper[4804]: I0217 13:26:46.657479 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657519 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657530 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.657570 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.658411 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.674158 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.691240 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.712686 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.727550 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.740922 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.754554 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759504 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759531 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.759540 4804 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.768376 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.785059 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.804624 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.815653 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.835552 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.850783 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:46Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861924 4804 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861950 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.861967 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964484 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964804 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964900 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.964973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:46 crc kubenswrapper[4804]: I0217 13:26:46.965120 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:46Z","lastTransitionTime":"2026-02-17T13:26:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068572 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.068680 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.172799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173675 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.173812 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277226 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.277267 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380544 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380661 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380697 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.380725 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483308 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.483352 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.570545 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 15:22:02.478099346 +0000 UTC Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573117 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573352 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573342 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573451 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573583 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.573639 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573739 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:47 crc kubenswrapper[4804]: E0217 13:26:47.573855 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587348 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587373 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.587427 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691418 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.691480 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.794938 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.794997 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795015 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795045 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.795066 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898368 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898377 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:47 crc kubenswrapper[4804]: I0217 13:26:47.898407 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:47Z","lastTransitionTime":"2026-02-17T13:26:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.001969 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002025 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002036 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002064 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.002074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105227 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105277 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105290 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.105325 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208902 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208931 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.208949 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.311939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312032 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.312114 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414297 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414388 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.414402 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518419 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518451 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518486 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.518509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.571226 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 16:53:49.741718618 +0000 UTC Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621362 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621403 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621429 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.621460 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729170 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.729193 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.832919 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.832989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833044 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.833067 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936128 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936216 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936263 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:48 crc kubenswrapper[4804]: I0217 13:26:48.936282 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:48Z","lastTransitionTime":"2026-02-17T13:26:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.038939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.038996 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039013 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039037 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.039055 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141769 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141853 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.141884 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245160 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.245296 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347670 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347840 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.347860 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450640 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.450649 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554619 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554638 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.554685 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.572418 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:29:46.294059487 +0000 UTC Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573819 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573904 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.573819 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574016 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.574082 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574267 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574455 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:49 crc kubenswrapper[4804]: E0217 13:26:49.574619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658430 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658453 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.658832 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762145 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762183 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.762246 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866072 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866179 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866293 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.866319 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.969858 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.969978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970007 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970047 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:49 crc kubenswrapper[4804]: I0217 13:26:49.970074 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:49Z","lastTransitionTime":"2026-02-17T13:26:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.073899 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176913 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176951 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.176976 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280433 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.280546 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383313 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383388 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.383490 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487561 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487595 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487632 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.487701 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.573425 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:47:07.574485598 +0000 UTC Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591141 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591159 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591186 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.591230 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695095 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695154 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695187 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.695221 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797173 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797240 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797251 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.797278 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900478 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900498 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:50 crc kubenswrapper[4804]: I0217 13:26:50.900515 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:50Z","lastTransitionTime":"2026-02-17T13:26:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004532 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004540 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004557 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.004565 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107875 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.107916 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211038 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211118 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211137 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.211154 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313452 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313477 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.313494 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349878 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349912 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.349938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350083 4804 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350143 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350124632 +0000 UTC m=+149.461543979 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350140 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350244 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350262 4804 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350299 4804 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350173 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350379 4804 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350334 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350312958 +0000 UTC m=+149.461732305 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350407 4804 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350446 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350412632 +0000 UTC m=+149.461832119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350503 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350476884 +0000 UTC m=+149.461896351 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.350629 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.350607057 +0000 UTC m=+149.462026434 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422741 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422809 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.422862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 
13:26:51.422891 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.525952 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526445 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526513 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.526538 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573030 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573179 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573071 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573047 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573341 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573411 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573530 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.573599 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:29:44.719726923 +0000 UTC Feb 17 13:26:51 crc kubenswrapper[4804]: E0217 13:26:51.573870 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630340 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630443 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.630501 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734576 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734593 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734623 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.734638 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837264 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837296 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.837320 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940071 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940175 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940236 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:51 crc kubenswrapper[4804]: I0217 13:26:51.940266 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:51Z","lastTransitionTime":"2026-02-17T13:26:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.043986 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.044019 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.044044 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147273 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147361 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147386 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147422 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.147445 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251567 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251625 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.251651 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355397 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355473 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355571 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.355595 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459717 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.459809 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563587 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563706 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563823 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.563845 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.573722 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 02:07:19.265326437 +0000 UTC Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.591621 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.666980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.667002 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769958 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.769969 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873117 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873191 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873228 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.873240 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:52 crc kubenswrapper[4804]: I0217 13:26:52.976496 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:52Z","lastTransitionTime":"2026-02-17T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079923 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079972 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.079987 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.080006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.080018 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183650 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183746 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.183769 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287679 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287799 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287826 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.287842 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398147 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398219 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398238 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.398249 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501707 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501756 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501767 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501784 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.501799 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573615 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573686 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.573791 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573717 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.573869 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573976 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.574092 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.573984 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 09:34:25.807309553 +0000 UTC Feb 17 13:26:53 crc kubenswrapper[4804]: E0217 13:26:53.574246 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605622 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605716 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605744 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.605765 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708177 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708235 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708244 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708260 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.708270 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812106 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812245 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812283 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.812311 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915234 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:53 crc kubenswrapper[4804]: I0217 13:26:53.915281 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:53Z","lastTransitionTime":"2026-02-17T13:26:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018734 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018873 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018912 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.018935 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.122648 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123138 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123414 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.123435 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226851 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226879 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.226899 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329792 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329886 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329909 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.329922 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433389 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433469 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433488 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.433502 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537242 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537253 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537271 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.537281 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.574875 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:53:10.983068433 +0000 UTC Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640193 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640302 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640326 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640366 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.640408 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744279 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744299 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744330 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.744348 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847457 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847475 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847500 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.847519 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950040 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950080 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:54 crc kubenswrapper[4804]: I0217 13:26:54.950113 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:54Z","lastTransitionTime":"2026-02-17T13:26:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053688 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.053829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.054564 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.054645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158511 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158582 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158599 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158626 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.158645 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.261971 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262073 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.262088 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364790 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364851 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364880 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.364908 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.467989 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.468008 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.468023 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571104 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571514 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571603 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.571764 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573462 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573561 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573466 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.573466 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.573892 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574010 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574239 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.574396 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.575634 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:23:18.463838243 +0000 UTC Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.674699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675005 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675084 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675269 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.675374 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.778927 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779321 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779432 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779516 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.779599 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882295 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882635 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882808 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.882890 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916660 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916782 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916877 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.916960 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:55 crc kubenswrapper[4804]: E0217 13:26:55.938669 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:55Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943705 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943864 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.943959 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.944053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:55 crc kubenswrapper[4804]: I0217 13:26:55.944137 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:55Z","lastTransitionTime":"2026-02-17T13:26:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.011787 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017331 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017352 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017381 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.017403 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.036262 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bf842257-95c9-4f3c-a5d3-b668d3623b7b\\\",\\\"systemUUID\\\":\\\"2305fbdc-66f1-473f-924a-04d713bb59e5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: E0217 13:26:56.036602 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039404 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039474 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039501 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.039515 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.613700 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:10:26.371310323 +0000 UTC Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631420 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.631454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.637927 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4q55t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"526d243d-907b-44f6-a601-de8e86515a3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb551e82660cac059fb23f76cb19628abc3275428306dd717b2547bf815856b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b575b6c098dd7791338505072dab5fb2f6f285ab667faa0ab626edf02e1670f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://471554f306c2b0712b930704a81192bb484d7b03e2456e91b7264d5c664de144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0602a151d51de35eb9ac57a8fe64c33589639722bbfb68581da06b314c8f29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33674ee05b6c3f13db8c297e0c3270eef9b023f9ba8b0111f3843c44b648da02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e427589f2fb3530874ac6157a6f7a18f58314e90bec73cddd02686fc354e2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e01a939b8121004a4d9c91faf880f40462368379fc7325db0d6cf564f46cb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tnx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4q55t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.653953 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb58f0da-7648-4e6c-bdb0-a6f3f13a18f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49ed163aeb74fd88b31698d97dbd2e0d2caf1078f5d3e4ea8c7c50dc3ade34c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51e74667f42eb82c453d44bc776048cf1c17633acbac42fd388cd1fa0b6f6696\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c459b0745ff24f88f33036314a8f6ab413210186a3a30c2236f888191d831930\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.667685 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40bdd16b-bf07-4c36-80dd-66372f796f39\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbdcf29ae01c66c5b1c3ed9c4aa7b3a6451bb49e10d52c319a2744105303386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b261eeb544e51ee2adb543542186c18759e667e1339fd724a3dce82b0391aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8a58720e24f8e51cc0749b46bb8ae7a1ecd576acaf200cce798a109e2e46fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2677496c02d25e670b94e00ddc43760e48755555d16ea7c6fb56c8bbebc19a44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.683353 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.698562 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1708d0d80f88162bacbfdfa7b8b9904857e939fb80832425fb09287b3e9e844b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.711503 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4fbbv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c6a0bd5-0682-4f6a-bca1-a7084e6c30bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d6af135b49f9e756e47ddcce266ee00c291f9209fc8d63ba8deb9c5a4b344a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwzf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4fbbv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.728214 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kclvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42eec48d-c990-43e6-8348-d9f78997ec3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:35Z\\\",\\\"message\\\":\\\"2026-02-17T13:25:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307\\\\n2026-02-17T13:25:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_fb1e1c63-e2f0-4d40-a150-555d53c42307 to /host/opt/cni/bin/\\\\n2026-02-17T13:25:50Z [verbose] multus-daemon started\\\\n2026-02-17T13:25:50Z [verbose] Readiness Indicator file check\\\\n2026-02-17T13:26:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvgkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kclvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734028 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734078 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734091 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.734127 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.741591 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.755992 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.767310 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec20591e-8008-4af6-83b7-51eb41217805\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16d90fa3ea6207e30c8c7d82c6d77586b791fbe1a490094e34f01371a61d89b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae10eb113c4f6e69ec71e2ef5e301093278d304f6fd564c0bfaa68aae3df53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.781383 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40958f8a96d5e847d0ad60966b90ebb3408d601b9d9b7b20ecaf59d5ae30189a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.800174 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8df4e52a-e578-472b-a6b3-418e9755714f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T13:26:43Z\\\",\\\"message\\\":\\\"v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0217 13:26:43.482340 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:43Z is after 2025-08-24T17:21:41Z]\\\\nI0217 13:26:43.482350 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482354 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-4q55t\\\\nI0217 13:26:43.482359 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-z522z\\\\nI0217 13:26:43.482364 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-ku\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:26:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1ee7e58231eeeec7
bc0dc22b7d8cff569db07981927c510ca398866476bc44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75nhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v8mv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.814820 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be1ee3c4-2152-421a-b39c-c1455968a17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c668758ab1ef4e6a8e2001ed6d70ec830b00746cc1454488df271b664e43024a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4c208ee4d38a7bd533fcc639d66957e6678
2fd5e5e8d7ec7a424b52bc4808b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:26:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vd9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ln7fh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.828178 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5cc0810-c040-4cf1-a739-fcd9be2be222\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T13:25:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 13:25:41.344218 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 13:25:41.346021 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3416295095/tls.crt::/tmp/serving-cert-3416295095/tls.key\\\\\\\"\\\\nI0217 13:25:47.445097 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 13:25:47.451450 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 13:25:47.451481 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 13:25:47.451508 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 13:25:47.451516 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 13:25:47.461779 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 13:25:47.461827 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 13:25:47.461850 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 13:25:47.461859 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 13:25:47.461868 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 13:25:47.461875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 13:25:47.462306 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 13:25:47.463495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d983b6c0ccc4100875f86d2e3c69b602
ecf69dc28883026162edc60e6a231c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T13:25:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T13:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836507 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836554 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836568 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836585 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.836597 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.840862 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94f082525d8665bda7e1df1846d79f53b6be7a81713fcb42f4193f75f45896bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d440440064baf0cbc5f8645c39f033bd96a2b29305c7c3a139d8bfe7983861b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.851846 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6992e22f-b963-46fc-ac41-4ca9938dda85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049b54c8b34dab788ac7987828a72004f987d5a9ac81cce54690f3c54543693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7
294523665171ada60cba034b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvrlj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zb7c5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.863988 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-z522z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d0b53df-b6de-4c33-a429-560638368e6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:25:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec2ea7c4966220e69d5784d2a7df5ce8c4248b116c6778bf34d515d651cc4bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:25:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-z522z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.877365 4804 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77722ba-d383-442c-b6dc-9983cf233257\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T13:26:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T13:26:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jfgm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T13:26:56Z is after 2025-08-24T17:21:41Z" Feb 17 13:26:56 crc 
kubenswrapper[4804]: I0217 13:26:56.939988 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940056 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940074 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:56 crc kubenswrapper[4804]: I0217 13:26:56.940121 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:56Z","lastTransitionTime":"2026-02-17T13:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042694 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042742 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042777 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.042790 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145759 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145837 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.145886 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.248980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249053 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249075 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.249089 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351620 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351673 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351709 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.351721 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.454983 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455051 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455068 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455097 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.455115 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559023 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559077 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559105 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.559237 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573698 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573745 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573691 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.573840 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.573943 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574130 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.574957 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.575170 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:26:57 crc kubenswrapper[4804]: E0217 13:26:57.575359 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.615307 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 03:59:15.388177514 +0000 UTC Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662306 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662722 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.662904 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 
crc kubenswrapper[4804]: I0217 13:26:57.663060 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.663279 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767090 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767669 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.767841 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.768070 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.768257 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.871461 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.871885 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872086 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.872400 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.975520 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.975926 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.976266 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.976664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:57 crc kubenswrapper[4804]: I0217 13:26:57.977030 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:57Z","lastTransitionTime":"2026-02-17T13:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080006 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080098 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080132 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.080146 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183583 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183672 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183821 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183857 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.183876 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286891 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286909 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286937 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.286955 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390783 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390806 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390838 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.390856 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494754 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494815 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494825 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.494858 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598396 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598455 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.598506 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.615660 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 01:30:09.590195273 +0000 UTC Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701131 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701152 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.701165 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804372 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804431 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804442 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804465 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.804485 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.907960 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908009 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908020 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908042 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:58 crc kubenswrapper[4804]: I0217 13:26:58.908057 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:58Z","lastTransitionTime":"2026-02-17T13:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012158 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012246 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012281 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.012293 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116611 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116659 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116671 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116691 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.116704 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219642 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.219766 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323100 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323167 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323190 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323248 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.323272 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426515 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426536 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426566 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.426587 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529720 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529786 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529835 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.529855 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573831 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573912 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573916 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.573848 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574018 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574248 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574365 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:26:59 crc kubenswrapper[4804]: E0217 13:26:59.574515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.616528 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:36:49.049420349 +0000 UTC Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633258 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633351 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633379 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633410 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.633432 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736812 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736895 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736915 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736945 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.736964 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840358 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840437 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840458 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840484 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.840503 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944320 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944421 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944463 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:26:59 crc kubenswrapper[4804]: I0217 13:26:59.944487 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:26:59Z","lastTransitionTime":"2026-02-17T13:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047480 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047534 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047552 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047578 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.047625 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151494 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151579 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151602 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.151647 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254700 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254783 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254805 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.254820 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.357980 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358049 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358059 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358095 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.358107 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462300 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462776 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462867 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.462931 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566665 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566721 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566731 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566749 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.566760 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.616928 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:06:51.72946099 +0000 UTC Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669949 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669968 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.669991 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.670003 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773256 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773327 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773374 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.773395 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876676 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876781 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876839 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.876863 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980481 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980559 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980573 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980616 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:00 crc kubenswrapper[4804]: I0217 13:27:00.980629 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:00Z","lastTransitionTime":"2026-02-17T13:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083772 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083861 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083884 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.083943 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187429 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187493 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187538 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.187556 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290810 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290892 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290920 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.290938 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394150 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394284 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394314 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394346 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.394376 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.498899 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.499153 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.499409 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573800 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573910 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.573921 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574017 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574168 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.574261 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574428 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:01 crc kubenswrapper[4804]: E0217 13:27:01.574633 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602921 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602976 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.602994 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.603021 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.603041 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.617559 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 07:59:26.942004683 +0000 UTC Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706807 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706829 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706863 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.706885 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810499 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810544 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.810562 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914176 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914292 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914311 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914337 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:01 crc kubenswrapper[4804]: I0217 13:27:01.914356 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:01Z","lastTransitionTime":"2026-02-17T13:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.017978 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018044 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018062 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018089 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.018109 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122262 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122288 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.122348 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225796 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225883 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225907 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225939 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.225960 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330109 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330280 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330305 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330331 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.330350 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435342 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435398 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435415 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435441 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.435454 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539185 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.539263 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.618407 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:09:15.919003072 +0000 UTC Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642188 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642261 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642276 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642298 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.642316 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745448 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745500 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745512 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745533 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.745549 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848124 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848223 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848246 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848275 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.848295 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951349 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951364 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951383 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:02 crc kubenswrapper[4804]: I0217 13:27:02.951393 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:02Z","lastTransitionTime":"2026-02-17T13:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055114 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055125 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055144 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.055157 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160172 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160325 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160402 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.160446 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264778 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264836 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264868 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.264880 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368058 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368123 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368142 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368171 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.368190 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471667 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471727 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.471789 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573330 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573315 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.573466 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.573518 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.573967 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.574051 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:03 crc kubenswrapper[4804]: E0217 13:27:03.574469 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575250 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575355 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575380 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.575399 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.619794 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 09:02:24.547241814 +0000 UTC Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678551 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678706 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678732 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678771 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.678794 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782723 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782844 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782869 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782897 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.782917 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886797 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886860 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886881 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.886894 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990466 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990539 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990558 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990586 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:03 crc kubenswrapper[4804]: I0217 13:27:03.990606 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:03Z","lastTransitionTime":"2026-02-17T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094776 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094831 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094849 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094872 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.094887 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197471 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197537 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197549 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197575 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.197591 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301360 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301454 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301490 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.301509 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405405 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405486 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405518 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405553 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.405577 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508689 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508751 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508768 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508794 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.508810 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613601 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613664 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613677 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613699 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.613717 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.620270 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:34:49.128726753 +0000 UTC Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717569 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717617 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717629 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717655 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.717669 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820338 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820392 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820409 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820438 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.820462 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922646 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922681 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922693 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922713 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:04 crc kubenswrapper[4804]: I0217 13:27:04.922761 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:04Z","lastTransitionTime":"2026-02-17T13:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025116 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025168 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025180 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025230 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.025245 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128265 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128335 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128354 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.128367 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231760 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231830 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231848 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231874 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.231893 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334730 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334811 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334833 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334862 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.334881 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438526 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438589 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438608 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.438655 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.541889 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542000 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542027 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542063 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.542084 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573170 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573239 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573264 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.573320 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573411 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573620 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.573959 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.574116 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.621101 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 01:23:00.111668103 +0000 UTC Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645151 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645282 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645322 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645357 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.645379 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.748962 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749043 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749112 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749149 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.749173 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.832190 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.832536 4804 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:27:05 crc kubenswrapper[4804]: E0217 13:27:05.832645 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs podName:e77722ba-d383-442c-b6dc-9983cf233257 nodeName:}" failed. No retries permitted until 2026-02-17 13:28:09.832619158 +0000 UTC m=+163.944038525 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs") pod "network-metrics-daemon-4jfgm" (UID: "e77722ba-d383-442c-b6dc-9983cf233257") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852615 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852686 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852711 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852745 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.852768 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956636 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956704 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956724 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956750 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:05 crc kubenswrapper[4804]: I0217 13:27:05.956766 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:05Z","lastTransitionTime":"2026-02-17T13:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060319 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060411 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060436 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060467 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.060489 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:06Z","lastTransitionTime":"2026-02-17T13:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143462 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143542 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143560 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143592 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.143612 4804 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T13:27:06Z","lastTransitionTime":"2026-02-17T13:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.218605 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h"] Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.219162 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: W0217 13:27:06.221766 4804 reflector.go:561] object-"openshift-cluster-version"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Feb 17 13:27:06 crc kubenswrapper[4804]: E0217 13:27:06.221818 4804 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.223819 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.224035 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.230504 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.278270 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4q55t" podStartSLOduration=79.278244135 podStartE2EDuration="1m19.278244135s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 13:27:06.255114443 +0000 UTC m=+100.366533790" watchObservedRunningTime="2026-02-17 13:27:06.278244135 +0000 UTC m=+100.389663482" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.299776 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.299748785 podStartE2EDuration="1m16.299748785s" podCreationTimestamp="2026-02-17 13:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.279044672 +0000 UTC m=+100.390464029" watchObservedRunningTime="2026-02-17 13:27:06.299748785 +0000 UTC m=+100.411168132" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.317906 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.317879032 podStartE2EDuration="48.317879032s" podCreationTimestamp="2026-02-17 13:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.300585381 +0000 UTC m=+100.412004728" watchObservedRunningTime="2026-02-17 13:27:06.317879032 +0000 UTC m=+100.429298379" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339158 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339242 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339261 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.339289 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.348017 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4fbbv" podStartSLOduration=79.347998654 podStartE2EDuration="1m19.347998654s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-17 13:27:06.347621643 +0000 UTC m=+100.459040990" watchObservedRunningTime="2026-02-17 13:27:06.347998654 +0000 UTC m=+100.459417991" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.383070 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kclvs" podStartSLOduration=79.38304584 podStartE2EDuration="1m19.38304584s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.365982837 +0000 UTC m=+100.477402174" watchObservedRunningTime="2026-02-17 13:27:06.38304584 +0000 UTC m=+100.494465177" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.414819 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.414792286 podStartE2EDuration="14.414792286s" podCreationTimestamp="2026-02-17 13:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.413776453 +0000 UTC m=+100.525195800" watchObservedRunningTime="2026-02-17 13:27:06.414792286 +0000 UTC m=+100.526211633" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440855 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440916 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440950 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.440965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.441002 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.441054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc 
kubenswrapper[4804]: I0217 13:27:06.441092 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.448541 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.464864 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.496706 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ln7fh" podStartSLOduration=78.496678915 podStartE2EDuration="1m18.496678915s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.496264141 +0000 UTC m=+100.607683478" watchObservedRunningTime="2026-02-17 13:27:06.496678915 +0000 UTC m=+100.608098272" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.545515 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.545482994 podStartE2EDuration="1m18.545482994s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.543924612 +0000 UTC m=+100.655343959" watchObservedRunningTime="2026-02-17 13:27:06.545482994 +0000 UTC m=+100.656902331" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.579685 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podStartSLOduration=79.579667631 podStartE2EDuration="1m19.579667631s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.578121499 +0000 UTC m=+100.689540836" watchObservedRunningTime="2026-02-17 13:27:06.579667631 +0000 UTC m=+100.691086968" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.602319 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z522z" podStartSLOduration=79.602290886 podStartE2EDuration="1m19.602290886s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:06.590156366 +0000 UTC m=+100.701575703" watchObservedRunningTime="2026-02-17 13:27:06.602290886 +0000 UTC m=+100.713710233" Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.621413 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:29:12.260451102 +0000 UTC Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.621621 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating 
certificates Feb 17 13:27:06 crc kubenswrapper[4804]: I0217 13:27:06.677005 4804 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.050517 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.052134 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bmp2h\" (UID: \"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.142615 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573554 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573648 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574282 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573782 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.573669 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574413 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574060 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:07 crc kubenswrapper[4804]: E0217 13:27:07.574514 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.665486 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" event={"ID":"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa","Type":"ContainerStarted","Data":"ecf158e37e36ff5370e83ad58a5c6c536f3c699310efe0e23372d83a3a73fe6a"} Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.665571 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" event={"ID":"3d2cf6e4-aa11-41d4-830e-d79d21f6f5fa","Type":"ContainerStarted","Data":"4b1c8010f5aaeda44df80b7740681f8f1788221ad9e605a1bac2f1b190c3f645"} Feb 17 13:27:07 crc kubenswrapper[4804]: I0217 13:27:07.684394 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bmp2h" podStartSLOduration=80.684375011 podStartE2EDuration="1m20.684375011s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:07.684290629 +0000 UTC m=+101.795710016" watchObservedRunningTime="2026-02-17 13:27:07.684375011 +0000 UTC m=+101.795794348" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573009 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573078 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573115 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.573183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573322 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573504 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.573744 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.574187 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:09 crc kubenswrapper[4804]: I0217 13:27:09.574482 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:27:09 crc kubenswrapper[4804]: E0217 13:27:09.574663 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.573390 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.573478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.574582 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.574591 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574694 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574798 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574830 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:11 crc kubenswrapper[4804]: E0217 13:27:11.574904 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:11 crc kubenswrapper[4804]: I0217 13:27:11.590721 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573742 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573774 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.573908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.573993 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:13 crc kubenswrapper[4804]: I0217 13:27:13.574075 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574022 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574290 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:13 crc kubenswrapper[4804]: E0217 13:27:13.574558 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573277 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573459 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.573732 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573824 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.573985 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:15 crc kubenswrapper[4804]: I0217 13:27:15.574189 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:15 crc kubenswrapper[4804]: E0217 13:27:15.574458 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:16 crc kubenswrapper[4804]: I0217 13:27:16.608659 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=5.608636632 podStartE2EDuration="5.608636632s" podCreationTimestamp="2026-02-17 13:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:16.606190682 +0000 UTC m=+110.717610019" watchObservedRunningTime="2026-02-17 13:27:16.608636632 +0000 UTC m=+110.720055989" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574020 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574104 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574153 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574314 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:17 crc kubenswrapper[4804]: I0217 13:27:17.574335 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574481 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574578 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:17 crc kubenswrapper[4804]: E0217 13:27:17.574624 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573103 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573105 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573178 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:19 crc kubenswrapper[4804]: I0217 13:27:19.573249 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573383 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573477 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573702 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:19 crc kubenswrapper[4804]: E0217 13:27:19.573930 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:20 crc kubenswrapper[4804]: I0217 13:27:20.575479 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:27:20 crc kubenswrapper[4804]: E0217 13:27:20.575791 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v8mv6_openshift-ovn-kubernetes(8df4e52a-e578-472b-a6b3-418e9755714f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.574422 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.575035 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.575062 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.575269 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:21 crc kubenswrapper[4804]: I0217 13:27:21.576170 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576488 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576626 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:21 crc kubenswrapper[4804]: E0217 13:27:21.576764 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.720392 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log" Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722024 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/0.log" Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722105 4804 generic.go:334] "Generic (PLEG): container finished" podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a" exitCode=1 Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722170 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"} Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.722277 4804 scope.go:117] "RemoveContainer" containerID="26414c5f4c2607626e125b0a1fa92effaa7a4ba5c807a051e6a959db2b2181aa" Feb 17 13:27:22 crc kubenswrapper[4804]: I0217 13:27:22.723084 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a" Feb 17 13:27:22 crc kubenswrapper[4804]: E0217 13:27:22.723454 
4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kclvs_openshift-multus(42eec48d-c990-43e6-8348-d9f78997ec3b)\"" pod="openshift-multus/multus-kclvs" podUID="42eec48d-c990-43e6-8348-d9f78997ec3b" Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573290 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573373 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573289 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.573494 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573430 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573568 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:23 crc kubenswrapper[4804]: E0217 13:27:23.573873 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:23 crc kubenswrapper[4804]: I0217 13:27:23.726417 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log" Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.573983 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574113 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574427 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574546 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:25 crc kubenswrapper[4804]: I0217 13:27:25.574296 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:25 crc kubenswrapper[4804]: E0217 13:27:25.574651 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:26 crc kubenswrapper[4804]: E0217 13:27:26.590463 4804 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 13:27:26 crc kubenswrapper[4804]: E0217 13:27:26.710666 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573486 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573526 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.573631 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573692 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573773 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.573922 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:27 crc kubenswrapper[4804]: I0217 13:27:27.574393 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:27 crc kubenswrapper[4804]: E0217 13:27:27.574539 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573237 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573313 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573431 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573513 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573641 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:29 crc kubenswrapper[4804]: I0217 13:27:29.573706 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.573853 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:29 crc kubenswrapper[4804]: E0217 13:27:29.574027 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573573 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573617 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.573770 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.573954 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573959 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.574074 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 13:27:31 crc kubenswrapper[4804]: I0217 13:27:31.573992 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.574455 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257" Feb 17 13:27:31 crc kubenswrapper[4804]: E0217 13:27:31.712569 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.573756 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.573889 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.573957 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.574048 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:33 crc kubenswrapper[4804]: I0217 13:27:33.574235 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574243 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574382 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:33 crc kubenswrapper[4804]: E0217 13:27:33.574515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.574415 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.770796 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.773767 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerStarted","Data":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"}
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.775449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6"
Feb 17 13:27:34 crc kubenswrapper[4804]: I0217 13:27:34.822145 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podStartSLOduration=107.822119374 podStartE2EDuration="1m47.822119374s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:34.820460109 +0000 UTC m=+128.931879486" watchObservedRunningTime="2026-02-17 13:27:34.822119374 +0000 UTC m=+128.933538751"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.489036 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"]
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.489165 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.489271 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573300 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573333 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:35 crc kubenswrapper[4804]: I0217 13:27:35.573293 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573446 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573587 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:35 crc kubenswrapper[4804]: E0217 13:27:35.573649 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:36 crc kubenswrapper[4804]: E0217 13:27:36.713826 4804 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574001 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574040 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574094 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574330 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574372 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574508 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.574769 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.574904 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:37 crc kubenswrapper[4804]: E0217 13:27:37.575184 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:37 crc kubenswrapper[4804]: I0217 13:27:37.789430 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:38 crc kubenswrapper[4804]: I0217 13:27:38.797243 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log"
Feb 17 13:27:38 crc kubenswrapper[4804]: I0217 13:27:38.797326 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde"}
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573690 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573733 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573713 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:39 crc kubenswrapper[4804]: I0217 13:27:39.573855 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575084 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575316 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575513 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:39 crc kubenswrapper[4804]: E0217 13:27:39.575617 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572950 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.574351 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572985 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.574698 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.572962 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:41 crc kubenswrapper[4804]: I0217 13:27:41.573155 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.575031 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jfgm" podUID="e77722ba-d383-442c-b6dc-9983cf233257"
Feb 17 13:27:41 crc kubenswrapper[4804]: E0217 13:27:41.575245 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573854 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573930 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.573850 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.575478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.578614 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579425 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.579858 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.581926 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 13:27:43 crc kubenswrapper[4804]: I0217 13:27:43.582168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.909973 4804 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.958810 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.959461 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.965583 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967037 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967570 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.967570 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.971972 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.972677 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.973788 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.974046 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.975491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.976883 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978045 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978383 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.978732 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.979789 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.980592 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.982055 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.983162 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.984185 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.986519 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987031 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987463 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.987468 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.993855 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"]
Feb 17 13:27:46 crc kubenswrapper[4804]: I0217 13:27:46.994341 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035689 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879pl\" (UniqueName: \"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035791 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035870 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035916 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035939 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035965 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.035987 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036015 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036057 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036097 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036156 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036292 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036362 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036402 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036461 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036616 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036747 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036787 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036826 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036866 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036900 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036935 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.036967 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037004 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037034 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf85x\" (UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.037111 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084540 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084563 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.084663 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.085722 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.086377 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093029 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093293 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.093541 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.098589 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117107 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117222 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117404 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117529 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.117831 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118075 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118260 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118299 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118376 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118429 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118494 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118676 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118724 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118823 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118952 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118991 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.118269 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119252 
4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119355 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.119459 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.124167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.124659 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.125689 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.125970 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126127 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126531 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126564 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126643 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.126732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.128084 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.128171 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.130219 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131025 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131250 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131416 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131532 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132095 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131607 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145224 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132163 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132224 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132269 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132319 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132365 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132405 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.132446 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132491 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.132530 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.133176 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.144161 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.131789 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145580 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145814 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.138035 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145951 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.145990 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: 
\"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146029 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146042 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146068 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146087 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146113 
4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146129 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146144 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146253 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146295 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146339 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.146370 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.149116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.149763 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-service-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150143 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150177 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150336 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150368 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150385 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150423 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150458 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf85x\" (UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: 
\"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150561 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879pl\" (UniqueName: 
\"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150617 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150634 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150674 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150694 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150746 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.150769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.150793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151321 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151523 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-ca\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151730 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151867 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.151976 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152000 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-config\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152083 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152495 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-policies\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.152612 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.168331 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.169629 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.170918 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-serving-cert\") pod \"etcd-operator-b45778765-h48zc\" (UID: 
\"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.171344 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.171866 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fc9b807-491e-4540-80ba-dd9243fa514c-metrics-tls\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172164 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172342 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.172563 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.173386 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.176068 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.189431 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.190045 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b1f4b64e-0bcd-420e-a4d6-80918348ed75-machine-approver-tls\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191389 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191618 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0593410f-5966-4c13-9978-dbb0dee5faab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.191879 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.174612 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e84359-a2f8-4d55-96e4-fda2ff226372-audit-dir\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192233 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192389 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kbpk6"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-auth-proxy-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.192736 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193212 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193591 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.193920 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.197469 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-encryption-config\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.197937 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.198171 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.198488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.199508 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6318da7c-2891-47a5-bebf-edd3da5a103a-config\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200092 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200444 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.200746 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.201043 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.201367 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.203787 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.204676 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f4b64e-0bcd-420e-a4d6-80918348ed75-config\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.205885 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.208035 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 
13:27:47.209323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.209574 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210273 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210306 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.210491 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.213529 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.213942 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211451 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.215492 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211509 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211525 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211579 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211601 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211633 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211781 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211850 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211889 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211923 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.211982 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.212016 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212097 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212211 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212247 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212339 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.212388 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.221714 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.222469 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.222918 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223177 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223218 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-serving-cert\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.224355 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.224911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.225073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dc5q\" (UniqueName: \"kubernetes.io/projected/0593410f-5966-4c13-9978-dbb0dee5faab-kube-api-access-5dc5q\") pod \"cluster-samples-operator-665b6dd947-fw7rw\" (UID: \"0593410f-5966-4c13-9978-dbb0dee5faab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.225718 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.225783 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227036 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226510 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.226602 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227150 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227248 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.223885 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-etcd-client\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227392 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227411 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.227572 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227747 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.228103 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.227486 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230483 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230838 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.230881 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.231366 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.231538 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232035 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232402 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.232879 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.233111 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.233814 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234889 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6nrg\" (UniqueName: \"kubernetes.io/projected/8a8b8b00-f9df-45f1-97af-4d88c02a4d98-kube-api-access-x6nrg\") pod \"authentication-operator-69f744f599-v8xf8\" (UID: \"8a8b8b00-f9df-45f1-97af-4d88c02a4d98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234907 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s664q\" (UniqueName: \"kubernetes.io/projected/88e84359-a2f8-4d55-96e4-fda2ff226372-kube-api-access-s664q\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.234953 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.235623 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.236673 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.238261 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.260247 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.260497 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snrpl\" (UniqueName: \"kubernetes.io/projected/2f4f3a44-0dd9-49eb-bbb8-ada255b278de-kube-api-access-snrpl\") pod \"cluster-image-registry-operator-dc59b4c8b-v6t2f\" (UID: \"2f4f3a44-0dd9-49eb-bbb8-ada255b278de\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.261605 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf85x\" (UniqueName: \"kubernetes.io/projected/7fc9b807-491e-4540-80ba-dd9243fa514c-kube-api-access-qf85x\") pod \"dns-operator-744455d44c-bpzqw\" (UID: \"7fc9b807-491e-4540-80ba-dd9243fa514c\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262058 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88e84359-a2f8-4d55-96e4-fda2ff226372-serving-cert\") pod \"apiserver-7bbb656c7d-zcsqv\" (UID: \"88e84359-a2f8-4d55-96e4-fda2ff226372\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.262878 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kp4p\" (UniqueName: \"kubernetes.io/projected/b1f4b64e-0bcd-420e-a4d6-80918348ed75-kube-api-access-6kp4p\") pod \"machine-approver-56656f9798-62vhn\" (UID: \"b1f4b64e-0bcd-420e-a4d6-80918348ed75\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264000 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6318da7c-2891-47a5-bebf-edd3da5a103a-etcd-client\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264617 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264666 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264692 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264846 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264890 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264935 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264956 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf78\" (UniqueName: \"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.264991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265015 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265033 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265054 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265074 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265141 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265159 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265195 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.265882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.266665 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"console-f9d7485db-tz5vz\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") " pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.267913 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.268160 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-audit-dir\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.268683 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-image-import-ca\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270253 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3ea797e4-54e0-4063-8d2b-647f6686e2a8-node-pullsecrets\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.270803 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.271848 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272021 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272293 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.272989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ea797e4-54e0-4063-8d2b-647f6686e2a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.273095 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.274944 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-encryption-config\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.275820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-etcd-client\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.275856 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.278057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ea797e4-54e0-4063-8d2b-647f6686e2a8-serving-cert\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.279149 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.281475 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.283389 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.287000 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.288409 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.289392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.290386 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.291614 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.291782 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.293088 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.299794 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.301238 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.301455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.302576 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.304276 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.304922 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.305911 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.307800 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.308111 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.310985 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.312606 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.316541 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.317723 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.319944 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.321007 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.321384 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.324278 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.324898 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-td8n5"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.325225 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.329310 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331036 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331080 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331096 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.331110 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.333370 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.333409 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.334355 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.335361 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.336649 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d6mxf"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.337321 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.337746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.338053 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.338762 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ssf69"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.339549 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.339774 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.340945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.341675 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.342439 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.344121 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.344743 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.345067 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.346681 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.351936 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6mxf"]
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.358708 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.362317 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366699 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366727 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366782 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf78\" (UniqueName: \"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366897 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366919 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366938 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.366996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.368600 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-images\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.368948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96df7f4c-b782-43e2-99b2-fa5219a59fd9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.369313 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-config\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370038 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-trusted-ca\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.370659 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb5c679-7c23-47fe-92b2-e035dceef1be-config\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.374373 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb5c679-7c23-47fe-92b2-e035dceef1be-serving-cert\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.380850 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn"
Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.381127 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.402492 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.404640 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.423090 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.438124 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f4b64e_0bcd_420e_a4d6_80918348ed75.slice/crio-3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece WatchSource:0}: Error finding container 3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece: Status 404 returned error can't find the container with id 3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.442984 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.461989 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.481955 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.518098 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: 
I0217 13:27:47.521566 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.583700 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjld\" (UniqueName: \"kubernetes.io/projected/6318da7c-2891-47a5-bebf-edd3da5a103a-kube-api-access-cmjld\") pod \"etcd-operator-b45778765-h48zc\" (UID: \"6318da7c-2891-47a5-bebf-edd3da5a103a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.585770 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.586280 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879pl\" (UniqueName: \"kubernetes.io/projected/4c36b00a-bd3f-424c-a67b-d828d782e60f-kube-api-access-879pl\") pod \"downloads-7954f5f757-w4nl5\" (UID: \"4c36b00a-bd3f-424c-a67b-d828d782e60f\") " pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.587953 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv"] Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.597971 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e84359_a2f8_4d55_96e4_fda2ff226372.slice/crio-c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961 WatchSource:0}: Error finding container c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961: Status 404 returned error can't find the container with id c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961 Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.602651 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.621910 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.630986 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.641715 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.683801 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.692887 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.701896 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.740148 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpzqw"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.751092 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.763006 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.764261 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc9b807_491e_4540_80ba_dd9243fa514c.slice/crio-a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf WatchSource:0}: Error finding container a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf: Status 404 returned error can't find the container with id a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.782954 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.807300 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w4nl5"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.809814 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.820406 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c36b00a_bd3f_424c_a67b_d828d782e60f.slice/crio-05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169 WatchSource:0}: Error finding container 05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169: Status 404 returned error can't find the container with id 05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169 Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.822090 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.840751 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" 
event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"a742f2a13e217d53ca3eb2ba15cfeaa06ae352d164ec190b1277029399e33daf"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.841936 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w4nl5" event={"ID":"4c36b00a-bd3f-424c-a67b-d828d782e60f","Type":"ContainerStarted","Data":"05aa1eba7052796e17cc8945ae7ffffe0694d784084a4b854dd8965a2ae59169"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.843473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerStarted","Data":"c9b9fab89f70bb50fd53e1da34a1ce026563ddf2c39c88b3643b42f5be6e5961"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.843710 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.846388 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"41bb1edb1f0afe3c11b4dbb133726a71c2b34b6e410a4e525ffe2969562e7d2e"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.846439 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"3ee07fa6ba6d36720c035c7ba665fa7a060fc106c811bb69d327f7cc277caece"} Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.852312 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.862655 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.867862 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb6b4b9_9e2e_4f39_9df0_068cfea71701.slice/crio-981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba WatchSource:0}: Error finding container 981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba: Status 404 returned error can't find the container with id 981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.869613 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.882106 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.900236 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h48zc"] Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.900930 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f4f3a44_0dd9_49eb_bbb8_ada255b278de.slice/crio-e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e WatchSource:0}: Error finding container e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e: Status 404 returned error can't find the container with id e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.902097 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.910695 4804 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8xf8"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.922954 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.944503 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.944831 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw"] Feb 17 13:27:47 crc kubenswrapper[4804]: I0217 13:27:47.961542 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.976987 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8b8b00_f9df_45f1_97af_4d88c02a4d98.slice/crio-18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e WatchSource:0}: Error finding container 18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e: Status 404 returned error can't find the container with id 18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e Feb 17 13:27:47 crc kubenswrapper[4804]: W0217 13:27:47.977312 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6318da7c_2891_47a5_bebf_edd3da5a103a.slice/crio-48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593 WatchSource:0}: Error finding container 48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593: Status 404 returned error can't find the container with id 48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593 Feb 17 13:27:47 crc 
kubenswrapper[4804]: I0217 13:27:47.989207 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.002435 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.028695 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.042321 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.062590 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.082440 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.102263 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.127577 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.142604 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.162065 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.182110 4804 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.202759 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.220458 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96df7f4c-b782-43e2-99b2-fa5219a59fd9-proxy-tls\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.222150 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.240289 4804 request.go:700] Waited for 1.014622436s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-dockercfg-5nsgg&limit=500&resourceVersion=0 Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.242319 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.262330 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.288388 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.301705 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.322136 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.342455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.362547 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.381872 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.403208 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.421932 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.441953 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.462374 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.482768 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.502343 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.522042 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.542757 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.562738 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.583812 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.603167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.622757 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.642232 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.661526 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.682825 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.702428 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.722306 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.742235 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.772628 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.782632 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.801969 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.822814 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.841783 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.851479 4804 generic.go:334] "Generic (PLEG): container finished" podID="88e84359-a2f8-4d55-96e4-fda2ff226372" containerID="db840a682474c367b56e255ffdf3eb9d92165aa180bac672885d9c489c386a53" exitCode=0 Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.851606 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerDied","Data":"db840a682474c367b56e255ffdf3eb9d92165aa180bac672885d9c489c386a53"} Feb 17 13:27:48 
crc kubenswrapper[4804]: I0217 13:27:48.863251 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.883434 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.884921 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" event={"ID":"8a8b8b00-f9df-45f1-97af-4d88c02a4d98","Type":"ContainerStarted","Data":"aae7fce8c275b283de609a3216bd0f9365d33571250507615ba1a3458f02bd7f"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.885158 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" event={"ID":"8a8b8b00-f9df-45f1-97af-4d88c02a4d98","Type":"ContainerStarted","Data":"18cccd51d77f45d9868a27c5b0569007979b1cdc38e98ebbbfc9ffa84164953e"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.886972 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" event={"ID":"6318da7c-2891-47a5-bebf-edd3da5a103a","Type":"ContainerStarted","Data":"7f15b20168e423fd201d1eeced63644c8d561aaf7c39309ce39a2fd78e85e8c6"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.887039 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" event={"ID":"6318da7c-2891-47a5-bebf-edd3da5a103a","Type":"ContainerStarted","Data":"48e546de21858befead80836e9b675ee2762bcd85ecd8cb4fd55e46f87d16593"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.889453 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" 
event={"ID":"b1f4b64e-0bcd-420e-a4d6-80918348ed75","Type":"ContainerStarted","Data":"82df64a60b5bf1a6b580ab38323b9d7b57eaf3e7ae18fd799114f6c9f29f9199"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.891447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerStarted","Data":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.891482 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerStarted","Data":"981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.893569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w4nl5" event={"ID":"4c36b00a-bd3f-424c-a67b-d828d782e60f","Type":"ContainerStarted","Data":"70c98ed9b17459ecce8c557560b6c3b5a2d9498e3d4a6a2641164154a5d9ebcd"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.894046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.895694 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.895761 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 
13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"f6954b1b50d4c56f25c40c74efaa930fbef86c99f3816505b436f7f793732ff3"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896466 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"89e9f8509daca8e8da2fceec2fdf5d9553ef99d017fe202811ed00db2b2a43f0"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.896490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" event={"ID":"0593410f-5966-4c13-9978-dbb0dee5faab","Type":"ContainerStarted","Data":"c4cb842047cae58db9bab53fd5252ce21137be446d1b7b340d70b2900c7b8f95"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.899065 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"fd0a18e263b6382f0a6ad77c2670c4636aeee3f39b9237950274312c8d97f3bc"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.899109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" event={"ID":"7fc9b807-491e-4540-80ba-dd9243fa514c","Type":"ContainerStarted","Data":"2a80dc7e230ee0a74471b1e6d89c5f07eec8b9a58a7070d834ff59c4154c186b"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.901898 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.902065 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" event={"ID":"2f4f3a44-0dd9-49eb-bbb8-ada255b278de","Type":"ContainerStarted","Data":"1a91717e61ed2e777aee1447674239b7a349199d2d59382e934466af8cf2fdae"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.902111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" event={"ID":"2f4f3a44-0dd9-49eb-bbb8-ada255b278de","Type":"ContainerStarted","Data":"e9fafcd585fc94a3539aa0f9f68c6276c65ebc758dd93035fbffb61e4de93d6e"} Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.942098 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.943720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggj9b\" (UniqueName: \"kubernetes.io/projected/3ea797e4-54e0-4063-8d2b-647f6686e2a8-kube-api-access-ggj9b\") pod \"apiserver-76f77b778f-46w22\" (UID: \"3ea797e4-54e0-4063-8d2b-647f6686e2a8\") " pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.962711 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 13:27:48 crc kubenswrapper[4804]: I0217 13:27:48.983135 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.002417 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.022289 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 13:27:49 crc 
kubenswrapper[4804]: I0217 13:27:49.047411 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.065379 4804 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.083686 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.104838 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.121730 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.141775 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.162546 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.182523 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.202293 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.204565 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.223420 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.257332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4xc\" (UniqueName: \"kubernetes.io/projected/96df7f4c-b782-43e2-99b2-fa5219a59fd9-kube-api-access-8l4xc\") pod \"machine-config-controller-84d6567774-tzzkm\" (UID: \"96df7f4c-b782-43e2-99b2-fa5219a59fd9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.262294 4804 request.go:700] Waited for 1.892694237s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.281145 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km45n\" (UniqueName: \"kubernetes.io/projected/bfb5c679-7c23-47fe-92b2-e035dceef1be-kube-api-access-km45n\") pod \"console-operator-58897d9998-mcszv\" (UID: \"bfb5c679-7c23-47fe-92b2-e035dceef1be\") " pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.305546 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf78\" (UniqueName: \"kubernetes.io/projected/17c8a131-fc0e-44b5-b374-846e6b2aeb1c-kube-api-access-2vf78\") pod \"machine-api-operator-5694c8668f-spfls\" (UID: \"17c8a131-fc0e-44b5-b374-846e6b2aeb1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.539939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.540860 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.0408037 +0000 UTC m=+144.152223077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.541767 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.543287 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.543392 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.576028 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-46w22"] Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642257 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.642475 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.142432505 +0000 UTC m=+144.253851842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642575 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642599 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x554\" (UniqueName: 
\"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642705 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.642756 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646575 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.646732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.646856 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.146831702 +0000 UTC m=+144.258251189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647276 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647327 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647354 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647378 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647403 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.647464 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649374 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649419 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649460 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649509 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649608 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649670 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 
13:27:49.649691 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649713 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649737 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649777 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649797 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649840 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649861 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649882 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.649903 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650227 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650274 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650378 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650399 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650467 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650525 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650587 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650632 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650656 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650697 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.650738 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651604 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651624 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651659 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.651684 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.752670 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.752758 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.252740519 +0000 UTC m=+144.364159856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753348 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753386 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753456 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753481 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753539 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753584 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753670 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753706 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753778 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753801 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753823 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753868 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753893 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753920 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.753987 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754031 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754067 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754383 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754447 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc 
kubenswrapper[4804]: I0217 13:27:49.754471 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754496 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: 
\"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754594 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754630 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754655 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754711 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754735 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754757 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754779 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: 
\"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.754836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.755084 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.757644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074c752f-fec1-4bd6-8773-596461ea288a-service-ca-bundle\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758606 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8789cf-f788-4c81-9624-532aa823de1c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758780 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c3cd53a-4a82-449d-a270-b41853fa2c8a-config\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.758967 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759158 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.759823 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc 
kubenswrapper[4804]: I0217 13:27:49.760474 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760655 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd4df830-6ec9-4f4d-860e-f97af3088371-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760904 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod 
\"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.760924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd4df830-6ec9-4f4d-860e-f97af3088371-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761085 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761115 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761238 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761285 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761305 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761416 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761436 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761453 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761525 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761603 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761634 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761676 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761692 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761870 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.761996 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.762258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.262246258 +0000 UTC m=+144.373665595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.763811 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770247 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-stats-auth\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " 
pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770657 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.770686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771011 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8789cf-f788-4c81-9624-532aa823de1c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771350 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771523 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771607 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.771788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 
crc kubenswrapper[4804]: I0217 13:27:49.771926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.772006 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.772035 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773265 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773345 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773427 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773539 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773568 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773614 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773664 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773732 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 
13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773805 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773872 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773888 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9400eb64-255c-46c2-b6c6-39260e013e92-config\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773955 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.773975 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774256 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774284 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774605 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod 
\"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774783 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774803 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/70a41b60-6ec1-491d-9d3e-88758d91c45e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.774822 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775789 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775838 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.775915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776058 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.776980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9400eb64-255c-46c2-b6c6-39260e013e92-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.777330 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.777752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde5d02-8e0d-4b24-b7bc-b9365013d942-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.778052 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.778672 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.779937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cde5d02-8e0d-4b24-b7bc-b9365013d942-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782340 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70a41b60-6ec1-491d-9d3e-88758d91c45e-serving-cert\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.782949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.783494 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-default-certificate\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.784371 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.784725 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.785443 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/074c752f-fec1-4bd6-8773-596461ea288a-metrics-certs\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.788023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.799641 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgww\" (UniqueName: \"kubernetes.io/projected/6f8789cf-f788-4c81-9624-532aa823de1c-kube-api-access-cwgww\") pod \"openshift-controller-manager-operator-756b6f6bc6-4shqj\" (UID: \"6f8789cf-f788-4c81-9624-532aa823de1c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.800467 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.801710 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.811525 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c3cd53a-4a82-449d-a270-b41853fa2c8a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.819290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.838920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c3cd53a-4a82-449d-a270-b41853fa2c8a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9t2lx\" (UID: \"4c3cd53a-4a82-449d-a270-b41853fa2c8a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.859484 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7dtm\" (UniqueName: \"kubernetes.io/projected/78dad77c-6d3f-43bc-93a3-ecd7dce378f3-kube-api-access-k7dtm\") pod \"openshift-apiserver-operator-796bbdcf4f-gffmb\" (UID: \"78dad77c-6d3f-43bc-93a3-ecd7dce378f3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.876953 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"route-controller-manager-6576b87f9c-mqkcq\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877159 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877178 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877212 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877239 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877260 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.877289 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.377277261 +0000 UTC m=+144.488696598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877307 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877341 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877357 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877377 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877394 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877414 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877434 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877498 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877530 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877547 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877561 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877594 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877631 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877738 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877770 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877792 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877819 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877842 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877905 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877925 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877972 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878047 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878071 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878100 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878133 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878223 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878246 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878269 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878304 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878479 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-csi-data-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.878519 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.378508401 +0000 UTC m=+144.489927738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.878981 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-images\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.879345 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-socket-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.879408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-mountpoint-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.880788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6882836-eb39-412c-a0d6-4906c9be9b89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.881401 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5192d8-6708-48c6-b5e5-a081f89d3e66-serving-cert\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.877632 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/360a1093-b581-4806-9f88-3d3907bd4895-tmpfs\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.881820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-plugins-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883069 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-srv-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883610 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fd233b99-2205-4e95-ba04-232015517afb-profile-collector-cert\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.883753 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-node-bootstrap-token\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886189 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a5192d8-6708-48c6-b5e5-a081f89d3e66-config\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"
Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886246 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-cabundle\") pod
\"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886579 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6882836-eb39-412c-a0d6-4906c9be9b89-proxy-tls\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.886968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-registration-dir\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-srv-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887701 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8e609565-a380-48f1-9b14-542a17c4ea50-cert\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.887866 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e27ee8-4574-4731-9324-031f9b3a209f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.888645 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1975682c-3445-467d-a0bd-a87b0ebf604b-certs\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889063 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a4453c-e1e8-4624-a19b-f08ec4df93d7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889128 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-apiservice-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889272 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"collect-profiles-29522235-lnbg8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.889619 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.890558 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/527ee9be-17be-4352-86fc-ef31bece3e86-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.891043 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aaa28d2-1ca6-42c3-98f7-58c644a03061-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.894994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/360a1093-b581-4806-9f88-3d3907bd4895-webhook-cert\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.895146 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-signing-key\") pod \"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.895750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.903235 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cde5d02-8e0d-4b24-b7bc-b9365013d942-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-d5pqr\" (UID: \"7cde5d02-8e0d-4b24-b7bc-b9365013d942\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.905513 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.907275 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c36c8731-9ee6-4ce6-8708-9e35e6112804-config-volume\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.908806 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36c8731-9ee6-4ce6-8708-9e35e6112804-metrics-tls\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.910948 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" event={"ID":"88e84359-a2f8-4d55-96e4-fda2ff226372","Type":"ContainerStarted","Data":"459928bdb43f6f1a1118edd680405432a2ec9199feeecf6f85fc24cb7d9f2210"} Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.911660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e27ee8-4574-4731-9324-031f9b3a209f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.912226 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"81d9af50a49b7a22054002410dd3f03b59f94ff18986b6c02f393dc6b67d21c6"} Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 
13:27:49.912678 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm"] Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.914550 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.914600 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.918544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95dt\" (UniqueName: \"kubernetes.io/projected/70a41b60-6ec1-491d-9d3e-88758d91c45e-kube-api-access-n95dt\") pod \"openshift-config-operator-7777fb866f-b8qc5\" (UID: \"70a41b60-6ec1-491d-9d3e-88758d91c45e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:49 crc kubenswrapper[4804]: W0217 13:27:49.921071 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96df7f4c_b782_43e2_99b2_fa5219a59fd9.slice/crio-337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35 WatchSource:0}: Error finding container 337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35: Status 404 returned error can't find the container with id 337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35 Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.928452 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.939910 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.958620 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.977968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9400eb64-255c-46c2-b6c6-39260e013e92-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjbls\" (UID: \"9400eb64-255c-46c2-b6c6-39260e013e92\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.979941 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.980294 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.48026567 +0000 UTC m=+144.591685017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:49 crc kubenswrapper[4804]: I0217 13:27:49.980749 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:49 crc kubenswrapper[4804]: E0217 13:27:49.996123 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.49610287 +0000 UTC m=+144.607522207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.002106 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.003923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"controller-manager-879f6c89f-j8ggj\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.020804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"oauth-openshift-558db77b4-bstw9\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.023408 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.046456 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.052822 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6h2\" (UniqueName: \"kubernetes.io/projected/bd4df830-6ec9-4f4d-860e-f97af3088371-kube-api-access-pp6h2\") pod \"ingress-operator-5b745b69d9-w4gnh\" (UID: \"bd4df830-6ec9-4f4d-860e-f97af3088371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.053702 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.059506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgt6h\" (UniqueName: \"kubernetes.io/projected/074c752f-fec1-4bd6-8773-596461ea288a-kube-api-access-rgt6h\") pod \"router-default-5444994796-kbpk6\" (UID: \"074c752f-fec1-4bd6-8773-596461ea288a\") " pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.066443 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.077042 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mcszv"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.085108 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.091333 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.092366 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.092970 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.592939425 +0000 UTC m=+144.704358762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.094035 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-spfls"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.100774 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.102142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnm4h\" (UniqueName: \"kubernetes.io/projected/ea50fe9b-465a-448b-97db-a91822afb720-kube-api-access-qnm4h\") pod \"migrator-59844c95c7-q46rz\" (UID: \"ea50fe9b-465a-448b-97db-a91822afb720\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.116909 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb5c679_7c23_47fe_92b2_e035dceef1be.slice/crio-5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de WatchSource:0}: Error finding container 5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de: Status 404 returned error can't find the container with id 5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.121971 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjhr\" (UniqueName: \"kubernetes.io/projected/2aaa28d2-1ca6-42c3-98f7-58c644a03061-kube-api-access-wcjhr\") pod \"package-server-manager-789f6589d5-6xkvs\" (UID: \"2aaa28d2-1ca6-42c3-98f7-58c644a03061\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.138854 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.157189 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54d9r\" (UniqueName: \"kubernetes.io/projected/fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6-kube-api-access-54d9r\") pod 
\"service-ca-9c57cc56f-5sp6x\" (UID: \"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.163350 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.166596 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxzwc\" (UniqueName: \"kubernetes.io/projected/4a5192d8-6708-48c6-b5e5-a081f89d3e66-kube-api-access-gxzwc\") pod \"service-ca-operator-777779d784-ttbrn\" (UID: \"4a5192d8-6708-48c6-b5e5-a081f89d3e66\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.176938 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.186644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9tr\" (UniqueName: \"kubernetes.io/projected/6c98dfab-f166-4eb4-b385-724d6b9b9d7a-kube-api-access-kx9tr\") pod \"control-plane-machine-set-operator-78cbb6b69f-t4m4g\" (UID: \"6c98dfab-f166-4eb4-b385-724d6b9b9d7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.197288 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.197715 4804 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.697704093 +0000 UTC m=+144.809123430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.205173 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mtv\" (UniqueName: \"kubernetes.io/projected/a6882836-eb39-412c-a0d6-4906c9be9b89-kube-api-access-l7mtv\") pod \"machine-config-operator-74547568cd-trrpm\" (UID: \"a6882836-eb39-412c-a0d6-4906c9be9b89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.244219 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzxb\" (UniqueName: \"kubernetes.io/projected/1975682c-3445-467d-a0bd-a87b0ebf604b-kube-api-access-4dzxb\") pod \"machine-config-server-ssf69\" (UID: \"1975682c-3445-467d-a0bd-a87b0ebf604b\") " pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.248308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"marketplace-operator-79b997595-6k2g8\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" 
Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.251521 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.262028 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8dn\" (UniqueName: \"kubernetes.io/projected/8e609565-a380-48f1-9b14-542a17c4ea50-kube-api-access-2k8dn\") pod \"ingress-canary-td8n5\" (UID: \"8e609565-a380-48f1-9b14-542a17c4ea50\") " pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.274823 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.282892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.298733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.298965 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.798937214 +0000 UTC m=+144.910356551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.299149 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5dmh\" (UniqueName: \"kubernetes.io/projected/81a4453c-e1e8-4624-a19b-f08ec4df93d7-kube-api-access-j5dmh\") pod \"olm-operator-6b444d44fb-vmmzq\" (UID: \"81a4453c-e1e8-4624-a19b-f08ec4df93d7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.299224 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.299744 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.799735751 +0000 UTC m=+144.911155168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.309262 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.312255 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbwg\" (UniqueName: \"kubernetes.io/projected/fd882ba0-9d9f-4a38-8a48-ab4d146fff56-kube-api-access-khbwg\") pod \"csi-hostpathplugin-vwjsv\" (UID: \"fd882ba0-9d9f-4a38-8a48-ab4d146fff56\") " pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.314572 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.327353 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4s62\" (UniqueName: \"kubernetes.io/projected/fd233b99-2205-4e95-ba04-232015517afb-kube-api-access-g4s62\") pod \"catalog-operator-68c6474976-645bx\" (UID: \"fd233b99-2205-4e95-ba04-232015517afb\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.360289 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"collect-profiles-29522235-lnbg8\" (UID: 
\"3768c453-c58d-4768-9620-a202cbb8ccd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.364637 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwnl\" (UniqueName: \"kubernetes.io/projected/360a1093-b581-4806-9f88-3d3907bd4895-kube-api-access-spwnl\") pod \"packageserver-d55dfcdfc-9d9jq\" (UID: \"360a1093-b581-4806-9f88-3d3907bd4895\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.371660 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.371901 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.384117 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h988v\" (UniqueName: \"kubernetes.io/projected/c36c8731-9ee6-4ce6-8708-9e35e6112804-kube-api-access-h988v\") pod \"dns-default-d6mxf\" (UID: \"c36c8731-9ee6-4ce6-8708-9e35e6112804\") " pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.385977 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-td8n5" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.401832 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.402226 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:50.902191323 +0000 UTC m=+145.013610660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.407699 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.407744 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228st\" (UniqueName: \"kubernetes.io/projected/527ee9be-17be-4352-86fc-ef31bece3e86-kube-api-access-228st\") pod \"multus-admission-controller-857f4d67dd-bbjwp\" (UID: \"527ee9be-17be-4352-86fc-ef31bece3e86\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.412059 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.419449 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxd8k\" (UniqueName: \"kubernetes.io/projected/28e27ee8-4574-4731-9324-031f9b3a209f-kube-api-access-kxd8k\") pod \"kube-storage-version-migrator-operator-b67b599dd-82b8f\" (UID: \"28e27ee8-4574-4731-9324-031f9b3a209f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.424955 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ssf69" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.426663 4804 csr.go:261] certificate signing request csr-pwcqm is approved, waiting to be issued Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.430253 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.434643 4804 csr.go:257] certificate signing request csr-pwcqm is issued Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.452494 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.502887 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.505012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.505476 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.005460343 +0000 UTC m=+145.116879680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.528693 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.532617 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.546478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.568523 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.599560 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.600494 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.605961 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.606102 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.106079763 +0000 UTC m=+145.217499100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.606287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.606638 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.10662228 +0000 UTC m=+145.218041617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.623164 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.623457 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.626535 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d929eaa_807c_4809_8b8a_78c186418e71.slice/crio-c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45 WatchSource:0}: Error finding container c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45: Status 404 returned error can't find the container with id c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45 Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.654666 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.707419 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.707663 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.207638204 +0000 UTC m=+145.319057541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.715791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.716667 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.216633585 +0000 UTC m=+145.328052922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.731727 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.770527 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.816999 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.817216 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.317169903 +0000 UTC m=+145.428589240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.817340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.817695 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.31768154 +0000 UTC m=+145.429100877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.836392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs"] Feb 17 13:27:50 crc kubenswrapper[4804]: W0217 13:27:50.837933 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea50fe9b_465a_448b_97db_a91822afb720.slice/crio-72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9 WatchSource:0}: Error finding container 72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9: Status 404 returned error can't find the container with id 72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9 Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.840090 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tz5vz" podStartSLOduration=123.84007423 podStartE2EDuration="2m3.84007423s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.83946289 +0000 UTC m=+144.950882227" watchObservedRunningTime="2026-02-17 13:27:50.84007423 +0000 UTC m=+144.951493567" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.877002 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fw7rw" 
podStartSLOduration=123.876984647 podStartE2EDuration="2m3.876984647s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.874701341 +0000 UTC m=+144.986120678" watchObservedRunningTime="2026-02-17 13:27:50.876984647 +0000 UTC m=+144.988403984" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.918926 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.919086 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.419054316 +0000 UTC m=+145.530473663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.919231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:50 crc kubenswrapper[4804]: E0217 13:27:50.919543 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.419531522 +0000 UTC m=+145.530950859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.919975 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v6t2f" podStartSLOduration=123.919958166 podStartE2EDuration="2m3.919958166s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.918313801 +0000 UTC m=+145.029733138" watchObservedRunningTime="2026-02-17 13:27:50.919958166 +0000 UTC m=+145.031377503" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.932321 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn"] Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.954056 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-62vhn" podStartSLOduration=123.954040797 podStartE2EDuration="2m3.954040797s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:50.953222181 +0000 UTC m=+145.064641518" watchObservedRunningTime="2026-02-17 13:27:50.954040797 +0000 UTC m=+145.065460134" Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.965649 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerID="69ade887fb4561f7461039cedf4c40001910b0d18b0de5daf1a6aeffb6f8d6d9" exitCode=0 Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.965837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerDied","Data":"69ade887fb4561f7461039cedf4c40001910b0d18b0de5daf1a6aeffb6f8d6d9"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.966613 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" event={"ID":"78dad77c-6d3f-43bc-93a3-ecd7dce378f3","Type":"ContainerStarted","Data":"0a96e1ef2bfcf8764be3660e10a30ae67e6eb64a806638e281a0fd209ce60dfc"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.967598 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kbpk6" event={"ID":"074c752f-fec1-4bd6-8773-596461ea288a","Type":"ContainerStarted","Data":"f04934fadfb13f4a2b94d23f826ccbf2c11587f3079cfc04ee775c0340ba1584"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.968720 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"3d97fb8448b10f48b071b6a70d4f4f2987b70d4bf2286e1821fcf2cadb229b90"} Feb 17 13:27:50 crc kubenswrapper[4804]: I0217 13:27:50.981547 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.014979 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-h48zc" podStartSLOduration=124.014965439 podStartE2EDuration="2m4.014965439s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.012558627 +0000 UTC m=+145.123977964" watchObservedRunningTime="2026-02-17 13:27:51.014965439 +0000 UTC m=+145.126384776" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.020674 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.021140 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.521122394 +0000 UTC m=+145.632541731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.029654 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"45a96212ff94af6d68214bf3f1edff552b90d12c42957570b833f1872469a96a"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.029696 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"2cb41ce0e1d66729234c40bc09930a98f8f2dbab8039e3fe7eb214142ec4274f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.067531 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"fd45a3b87ee7ec050a1e2226df399e2ca244c1384212608f1376c50fc62ba63e"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.067572 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"337c4bb83e1a716779c975b834d44dccf972ba542a98cfad7e63b00502637c35"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.069571 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" event={"ID":"4c3cd53a-4a82-449d-a270-b41853fa2c8a","Type":"ContainerStarted","Data":"dcb906021bb914b0d35e536db8e77dce9a79a9b9c7a4d14ebe8fb578f4372c29"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.072487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mcszv" event={"ID":"bfb5c679-7c23-47fe-92b2-e035dceef1be","Type":"ContainerStarted","Data":"5b51801517ec57dbe5966aafa0af9b7b049eb2350ff2d09ce72342347190d8de"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.078289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" event={"ID":"6f8789cf-f788-4c81-9624-532aa823de1c","Type":"ContainerStarted","Data":"a209eb83190d90d2fb6a3d22177034b2d0090e6a251f0ff17bf2d6cb44e252d6"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.083783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerStarted","Data":"a4b6cbfefaf077ffe0f3e71671fde2907fe889b88fd4a0d27ee5e5b910c2832f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.112144 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" event={"ID":"9400eb64-255c-46c2-b6c6-39260e013e92","Type":"ContainerStarted","Data":"1e0db7b9855a1df421980bae6f948a4fd3ebe623b23861e2e967766f9a6951c3"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.126058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" 
(UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.127134 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.627117025 +0000 UTC m=+145.738536362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.136917 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" event={"ID":"7cde5d02-8e0d-4b24-b7bc-b9365013d942","Type":"ContainerStarted","Data":"4622fca4f4493e3824ba6757145476211ce7042560569e7102a91e59e70f017e"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.139797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ssf69" event={"ID":"1975682c-3445-467d-a0bd-a87b0ebf604b","Type":"ContainerStarted","Data":"fad0a6ed87fc56a684f2075982688a0f7f794c18f1763206315edfb485e10c3f"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.156123 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerStarted","Data":"c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45"} Feb 17 13:27:51 crc 
kubenswrapper[4804]: I0217 13:27:51.161036 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"72a7688c67fdc9df6acab36749b98a85a0d0d109fea872cb149058d0c9e7d1c9"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.163429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerStarted","Data":"71eeeb2236ea109e4995422167d6b6185d64b78a4f394944d8af1d30f1eaa147"} Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.164408 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.164454 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.227299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.227663 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.727636372 +0000 UTC m=+145.839055709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.320543 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5192d8_6708_48c6_b5e5_a081f89d3e66.slice/crio-157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b WatchSource:0}: Error finding container 157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b: Status 404 returned error can't find the container with id 157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.329802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.330427 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.830409845 +0000 UTC m=+145.941829192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.431712 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.431881 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.931853903 +0000 UTC m=+146.043273250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.434910 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.435416 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 13:22:50 +0000 UTC, rotation deadline is 2027-01-10 04:50:41.090416388 +0000 UTC Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.435457 4804 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7839h22m49.654962545s for next certificate rotation Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.436782 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:51.936739176 +0000 UTC m=+146.048158513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.447927 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5sp6x"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.554950 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.555665 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.556092 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.056072323 +0000 UTC m=+146.167491660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.556800 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vwjsv"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.557016 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8xf8" podStartSLOduration=124.556996864 podStartE2EDuration="2m4.556996864s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.544660101 +0000 UTC m=+145.656079448" watchObservedRunningTime="2026-02-17 13:27:51.556996864 +0000 UTC m=+145.668416201" Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.566542 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6mxf"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.605161 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-w4nl5" podStartSLOduration=124.605143567 podStartE2EDuration="2m4.605143567s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.60194123 +0000 UTC m=+145.713360597" watchObservedRunningTime="2026-02-17 13:27:51.605143567 +0000 UTC m=+145.716562904" Feb 17 13:27:51 crc 
kubenswrapper[4804]: I0217 13:27:51.658022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.658661 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.158649499 +0000 UTC m=+146.270068836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.680631 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-td8n5"] Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.690671 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4df830_6ec9_4f4d_860e_f97af3088371.slice/crio-36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226 WatchSource:0}: Error finding container 36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226: Status 404 returned error can't find the container with id 36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226 Feb 17 13:27:51 crc kubenswrapper[4804]: 
I0217 13:27:51.699687 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bpzqw" podStartSLOduration=124.699668213 podStartE2EDuration="2m4.699668213s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.695131481 +0000 UTC m=+145.806550828" watchObservedRunningTime="2026-02-17 13:27:51.699668213 +0000 UTC m=+145.811087550" Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.762110 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e609565_a380_48f1_9b14_542a17c4ea50.slice/crio-a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8 WatchSource:0}: Error finding container a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8: Status 404 returned error can't find the container with id a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8 Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.762229 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.262181077 +0000 UTC m=+146.373600414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.762134 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.762820 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.764649 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.264630399 +0000 UTC m=+146.376049736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.805358 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.811122 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.813947 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.828675 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.843958 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.866551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.866974 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.366955856 +0000 UTC m=+146.478375193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.870897 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bbjwp"] Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.889676 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce6eded_da13_4bb7_a87d_71b87d0e7f06.slice/crio-8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4 WatchSource:0}: Error finding container 8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4: Status 404 returned error can't find the container with id 8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4 Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.890396 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6882836_eb39_412c_a0d6_4906c9be9b89.slice/crio-f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879 WatchSource:0}: Error finding container f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879: Status 404 returned error can't find the container with id f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879 
Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.906191 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f"] Feb 17 13:27:51 crc kubenswrapper[4804]: W0217 13:27:51.931142 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd233b99_2205_4e95_ba04_232015517afb.slice/crio-c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd WatchSource:0}: Error finding container c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd: Status 404 returned error can't find the container with id c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.964399 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq"] Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.968022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:51 crc kubenswrapper[4804]: E0217 13:27:51.968362 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.468350743 +0000 UTC m=+146.579770080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:51 crc kubenswrapper[4804]: I0217 13:27:51.996657 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" podStartSLOduration=123.99664103 podStartE2EDuration="2m3.99664103s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:51.995626986 +0000 UTC m=+146.107046323" watchObservedRunningTime="2026-02-17 13:27:51.99664103 +0000 UTC m=+146.108060367" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.055024 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" podStartSLOduration=125.054965234 podStartE2EDuration="2m5.054965234s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.037708766 +0000 UTC m=+146.149128103" watchObservedRunningTime="2026-02-17 13:27:52.054965234 +0000 UTC m=+146.166384571" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.069149 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.069319 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.569292974 +0000 UTC m=+146.680712311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.071904 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.072410 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.572398968 +0000 UTC m=+146.683818305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.172971 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.173252 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.673179104 +0000 UTC m=+146.784598441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.173658 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.174091 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.674078794 +0000 UTC m=+146.785498131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.174769 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4shqj" event={"ID":"6f8789cf-f788-4c81-9624-532aa823de1c","Type":"ContainerStarted","Data":"35c48e8fb2fa81f91413e79f65510c475665d951d8e3729fdc7ec2652d35c229"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.182585 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"f43a05989a0cb018994c7ed93f6fcbb3287c333caa1e224a9b8ad854ef507a79"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.186056 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerStarted","Data":"d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.186448 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.193666 4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.193716 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.204368 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podStartSLOduration=124.204355288 podStartE2EDuration="2m4.204355288s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.200701165 +0000 UTC m=+146.312120502" watchObservedRunningTime="2026-02-17 13:27:52.204355288 +0000 UTC m=+146.315774625" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.235676 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerStarted","Data":"8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.272560 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" event={"ID":"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6","Type":"ContainerStarted","Data":"d09f27464dd0852c3eb4f37afc58c154c9fbb7700f52b9305aaf919abcafbf4a"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274379 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ssf69" event={"ID":"1975682c-3445-467d-a0bd-a87b0ebf604b","Type":"ContainerStarted","Data":"880ed9d5f01451b75d2dc6ed95deb43e16cb8367642cbf6aeea953cb0fd0e13c"} Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.274528 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.774500308 +0000 UTC m=+146.885919695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.274858 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.275486 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.77546926 +0000 UTC m=+146.886888597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.288889 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.295031 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.296027 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.297302 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ssf69" podStartSLOduration=5.297291411 podStartE2EDuration="5.297291411s" podCreationTimestamp="2026-02-17 13:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.295463039 +0000 UTC m=+146.406882376" watchObservedRunningTime="2026-02-17 13:27:52.297291411 
+0000 UTC m=+146.408710748" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.307609 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" event={"ID":"4c3cd53a-4a82-449d-a270-b41853fa2c8a","Type":"ContainerStarted","Data":"277f5e06890a3b0a429ab21613a9d4cdb62546cc0ce8158d53f54e0de8d34994"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.317360 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"f8106f5d88fc33d09cd9b635c7548d409081e8d0575eabfd1196bb8ee25c8879"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.320010 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerStarted","Data":"76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.320554 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.325046 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" event={"ID":"78dad77c-6d3f-43bc-93a3-ecd7dce378f3","Type":"ContainerStarted","Data":"ab20668d8bb8760f6b156b43021f78b8c090ccfac11a6030ff207b369f1b77ce"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.326504 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"36a6f6266cb43ae1b4df1c4437d84aa74e1b0d87cb0df59fa8f5005efed37226"} Feb 17 13:27:52 crc 
kubenswrapper[4804]: I0217 13:27:52.328602 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"98fc7dcdd6e644150fd44190a09bb7717d7abbaa177adba025a2f687c8e15714"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.328630 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"347362c67773c71448c8813eb3a8b6fe9bbfca98f9e69e54c49cdc2ff253fd7e"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.330452 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" event={"ID":"96df7f4c-b782-43e2-99b2-fa5219a59fd9","Type":"ContainerStarted","Data":"371959c237c07729c0ef1bbe1f63b6e5bc67fdacdfa6b555347a4ef23550900d"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.332079 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kbpk6" event={"ID":"074c752f-fec1-4bd6-8773-596461ea288a","Type":"ContainerStarted","Data":"8ef76cd8399e73581929a3003d22c9543d350149ae189e6a2a726e30aa4305b4"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.361620 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" event={"ID":"17c8a131-fc0e-44b5-b374-846e6b2aeb1c","Type":"ContainerStarted","Data":"80d210a2a3c63131fb8282f24532253bbe1e87464049f2ccc07402908932ed1e"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.364730 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" 
event={"ID":"28e27ee8-4574-4731-9324-031f9b3a209f","Type":"ContainerStarted","Data":"8a4c062f9db2dace0be040b4679b42ec47596225d8d9ba0a189393b5a3eab071"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.367145 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"752275f6e311d6fa52f9ec458f3dd3d978e12c3b722feb4df6e38efa3fb4bed2"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.368669 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"b981933933cb97d97cf932eb8b0d74b01f46746f21c79b9a7996eaaaeb1edc53"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.370971 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9t2lx" podStartSLOduration=125.370953368 podStartE2EDuration="2m5.370953368s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.341965968 +0000 UTC m=+146.453385305" watchObservedRunningTime="2026-02-17 13:27:52.370953368 +0000 UTC m=+146.482372705" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.375120 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tzzkm" podStartSLOduration=125.375101737 podStartE2EDuration="2m5.375101737s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.370066879 +0000 UTC m=+146.481486216" watchObservedRunningTime="2026-02-17 
13:27:52.375101737 +0000 UTC m=+146.486521064" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.375812 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.377449 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.877420185 +0000 UTC m=+146.988839522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.393059 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-td8n5" event={"ID":"8e609565-a380-48f1-9b14-542a17c4ea50","Type":"ContainerStarted","Data":"a5a45f7c4eccd84da9dcfe9858c4f610d2fd4675825f75daef98e8d3787a75e8"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.436470 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gffmb" podStartSLOduration=125.436452602 podStartE2EDuration="2m5.436452602s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.434183656 +0000 UTC m=+146.545602993" watchObservedRunningTime="2026-02-17 13:27:52.436452602 +0000 UTC m=+146.547871939" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.447004 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" event={"ID":"81a4453c-e1e8-4624-a19b-f08ec4df93d7","Type":"ContainerStarted","Data":"dddf129af9a00ddc4f0969d0e5a291dc33022c3ebe91aea664d9b86e31058b0d"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.467974 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kbpk6" podStartSLOduration=125.467951258 podStartE2EDuration="2m5.467951258s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.459779063 +0000 UTC m=+146.571198400" watchObservedRunningTime="2026-02-17 13:27:52.467951258 +0000 UTC m=+146.579370595" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.477981 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.479628 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:52.979613777 +0000 UTC m=+147.091033114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.483880 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" event={"ID":"4a5192d8-6708-48c6-b5e5-a081f89d3e66","Type":"ContainerStarted","Data":"61a388e9adc9a943377d688d27c2dc81dfad669670c5a3bf1b1a23df23c9b059"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.484016 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" event={"ID":"4a5192d8-6708-48c6-b5e5-a081f89d3e66","Type":"ContainerStarted","Data":"157f17bffbbd85a42fa16a2d7a38651d21a006c81233b8debee3b768edda376b"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.486491 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"b2a2a565951f1940fab3a2e856fad4543924e55909a40759f9b387aadb12721f"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.490133 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" event={"ID":"fd233b99-2205-4e95-ba04-232015517afb","Type":"ContainerStarted","Data":"c4ca83a20b333f19f7d43356e38f804b985ba421896d923dfc31a2eb98a1fdcd"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.493289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" event={"ID":"7cde5d02-8e0d-4b24-b7bc-b9365013d942","Type":"ContainerStarted","Data":"c924f7a2b708775eb6e228afa4b84edbaca3d7ee7f383bfe914fd277a0572a48"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.495581 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerStarted","Data":"e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.496540 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.501553 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.501611 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.516657 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-spfls" podStartSLOduration=124.516630698 podStartE2EDuration="2m4.516630698s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 13:27:52.50654076 +0000 UTC m=+146.617960097" watchObservedRunningTime="2026-02-17 13:27:52.516630698 +0000 UTC m=+146.628050035" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.521451 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" event={"ID":"6c98dfab-f166-4eb4-b385-724d6b9b9d7a","Type":"ContainerStarted","Data":"3763f04c0b5a11e4d2f859d573b3f3722479804f6fd8c70f1af703155e237371"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.521496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" event={"ID":"6c98dfab-f166-4eb4-b385-724d6b9b9d7a","Type":"ContainerStarted","Data":"3d0d40e975c4ac627b43e84809a03150f409bc6614f8fb7f3a983ec339ab9823"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532184 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerStarted","Data":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532277 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.532951 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podStartSLOduration=125.532932743 podStartE2EDuration="2m5.532932743s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.530788832 +0000 UTC m=+146.642208169" watchObservedRunningTime="2026-02-17 13:27:52.532932743 +0000 UTC m=+146.644352070" Feb 17 
13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.536657 4804 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bstw9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.536707 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.540768 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mcszv" event={"ID":"bfb5c679-7c23-47fe-92b2-e035dceef1be","Type":"ContainerStarted","Data":"ddf59f8040b668f0f4c58d8bef4c204dcd3e4b466336f43e6ee9b0870efcec50"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.541532 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.542567 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" event={"ID":"360a1093-b581-4806-9f88-3d3907bd4895","Type":"ContainerStarted","Data":"30624883332b7603a32f0b5c7350e0ff499a85473ddd4fbdc16ed765cd6f36f8"} Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.543294 4804 patch_prober.go:28] interesting pod/console-operator-58897d9998-mcszv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 13:27:52 crc 
kubenswrapper[4804]: I0217 13:27:52.543334 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podUID="bfb5c679-7c23-47fe-92b2-e035dceef1be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.549863 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zcsqv" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.584576 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.585555 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.085540856 +0000 UTC m=+147.196960193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.585720 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-d5pqr" podStartSLOduration=125.585702651 podStartE2EDuration="2m5.585702651s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.557533628 +0000 UTC m=+146.668952965" watchObservedRunningTime="2026-02-17 13:27:52.585702651 +0000 UTC m=+146.697121988" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.585953 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ttbrn" podStartSLOduration=124.585947559 podStartE2EDuration="2m4.585947559s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.584064066 +0000 UTC m=+146.695483403" watchObservedRunningTime="2026-02-17 13:27:52.585947559 +0000 UTC m=+146.697366896" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.685954 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t4m4g" podStartSLOduration=124.685930909 podStartE2EDuration="2m4.685930909s" podCreationTimestamp="2026-02-17 13:25:48 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.667493271 +0000 UTC m=+146.778912608" watchObservedRunningTime="2026-02-17 13:27:52.685930909 +0000 UTC m=+146.797350246" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.686685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.689643 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.189605181 +0000 UTC m=+147.301024518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.715639 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podStartSLOduration=125.715620883 podStartE2EDuration="2m5.715620883s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.715261911 +0000 UTC m=+146.826681258" watchObservedRunningTime="2026-02-17 13:27:52.715620883 +0000 UTC m=+146.827040230" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.747916 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podStartSLOduration=125.747896805 podStartE2EDuration="2m5.747896805s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:52.741181919 +0000 UTC m=+146.852601256" watchObservedRunningTime="2026-02-17 13:27:52.747896805 +0000 UTC m=+146.859316142" Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.788340 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.788783 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.288765943 +0000 UTC m=+147.400185280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.889514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.889832 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.389819278 +0000 UTC m=+147.501238615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.990285 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.990457 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.490422238 +0000 UTC m=+147.601841575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:52 crc kubenswrapper[4804]: I0217 13:27:52.990569 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:52 crc kubenswrapper[4804]: E0217 13:27:52.990979 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.490968116 +0000 UTC m=+147.602387533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.068178 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.071277 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:53 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:53 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:53 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.071332 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.091679 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.092259 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.592236868 +0000 UTC m=+147.703656235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.193638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.194092 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.694069989 +0000 UTC m=+147.805489396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.294725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.295180 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.795161475 +0000 UTC m=+147.906580812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.400497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.400889 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:53.900868436 +0000 UTC m=+148.012287773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.505775 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.506259 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.006217595 +0000 UTC m=+148.117636932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.506433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.506900 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.006890637 +0000 UTC m=+148.118309974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.549614 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"2a9ec8b1b537fc993084160d692e82780a76bdb11a34d891cccbf3f8dac45031"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.552017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" event={"ID":"81a4453c-e1e8-4624-a19b-f08ec4df93d7","Type":"ContainerStarted","Data":"8dfcf289f6b49c93fb0be9c6c8194c60cc7412daf6fe6ff845b21d0f07db7852"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.552213 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.553864 4804 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vmmzq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.553903 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podUID="81a4453c-e1e8-4624-a19b-f08ec4df93d7" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.555141 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"79a0541efdfdf50b3866ac8b0b6206b325d36bec38c322976e1c15ffbaf6838f"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.555168 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-46w22" event={"ID":"3ea797e4-54e0-4063-8d2b-647f6686e2a8","Type":"ContainerStarted","Data":"a23b15fab9f56fb7e41335504d13cb65034f5191177ba08a44ab7be11ffffa97"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.560022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-td8n5" event={"ID":"8e609565-a380-48f1-9b14-542a17c4ea50","Type":"ContainerStarted","Data":"ca7c5a82ceeba6e17d07d04b4b8b17de5363d890a0ac2ed1cd83b59d39695391"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.563937 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" event={"ID":"9400eb64-255c-46c2-b6c6-39260e013e92","Type":"ContainerStarted","Data":"acb53343e24b64214eb8d63b479506b99fb10c57ac8fe080ef42bcca2d89b04d"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.565654 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" event={"ID":"360a1093-b581-4806-9f88-3d3907bd4895","Type":"ContainerStarted","Data":"2782ad929923ff294638d4cfc8dab0936f539b2948e17124ffd7769c4d9020c9"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.566013 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:27:53 crc 
kubenswrapper[4804]: I0217 13:27:53.567048 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" event={"ID":"fe7f1d16-c21b-4de3-9f7b-d8bfe8f026c6","Type":"ContainerStarted","Data":"56c53ed4105db510e93c67b057053181df6ebbb6d8541ab870a20fa4dc8300ce"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.567286 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.567324 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569316 4804 generic.go:334] "Generic (PLEG): container finished" podID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerID="691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd" exitCode=0 Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569362 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerDied","Data":"691b4b9bb1ec8153713740adfbef24a2316c8f6246fc43ff1dca064d36af3efd"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569399 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" 
event={"ID":"70a41b60-6ec1-491d-9d3e-88758d91c45e","Type":"ContainerStarted","Data":"7355a035aa7a473d1e3a624201cd7f56eee3e639ccdb8eaad53592345713c9d4"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.569581 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.570408 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"9daa60af0a56693576a356277be4e8c6d9f194e128d21b17f28caa1465248b95"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.574059 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerStarted","Data":"4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.576020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" event={"ID":"2aaa28d2-1ca6-42c3-98f7-58c644a03061","Type":"ContainerStarted","Data":"b9519c5563ed579e5f6bcfbd075913b2603d9e5eee582aaa188ea4e5e16e7df8"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.576646 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.585027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerStarted","Data":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.585927 4804 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587142 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" event={"ID":"ea50fe9b-465a-448b-97db-a91822afb720","Type":"ContainerStarted","Data":"8bd913f3da6cc61714dd2cb60137b6fcbbd0af3fcff7addf3f4ccc60444dc47f"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587814 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.587844 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.588825 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" event={"ID":"28e27ee8-4574-4731-9324-031f9b3a209f","Type":"ContainerStarted","Data":"46d82360d36ed3e50bb06f3f1654ff43325603c6cf8e180cc5b5ae8beaa2dcf0"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.590122 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"6ae1bb948ecea1ad7a23965654328918240b275b6050fcce0a3757ab066fb634"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.591109 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"c4015993bceb7bee92d209c40fe142b66dc9cece4f5b4d1e69297b741f615cb7"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.593481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" event={"ID":"fd233b99-2205-4e95-ba04-232015517afb","Type":"ContainerStarted","Data":"206ebf3fd6fd37435daf25f2fe623fe2cd8a8e6c9a6b6697d1335963dc1111f1"} Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.593512 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594140 4804 patch_prober.go:28] interesting pod/console-operator-58897d9998-mcszv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594185 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mcszv" podUID="bfb5c679-7c23-47fe-92b2-e035dceef1be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594279 4804 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bstw9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.594340 
4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595030 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595049 4804 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-645bx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595071 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595087 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podUID="fd233b99-2205-4e95-ba04-232015517afb" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595679 4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq 
container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.595712 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.605061 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podStartSLOduration=125.605047885 podStartE2EDuration="2m5.605047885s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.583531604 +0000 UTC m=+147.694950941" watchObservedRunningTime="2026-02-17 13:27:53.605047885 +0000 UTC m=+147.716467222" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607297 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5sp6x" podStartSLOduration=125.60728992 podStartE2EDuration="2m5.60728992s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.604571609 +0000 UTC m=+147.715990946" watchObservedRunningTime="2026-02-17 13:27:53.60728992 +0000 UTC m=+147.718709257" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607732 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.607898 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.107880849 +0000 UTC m=+148.219300186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.607946 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.609399 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.109391211 +0000 UTC m=+148.220810548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.629404 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" podStartSLOduration=125.629386301 podStartE2EDuration="2m5.629386301s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.628683467 +0000 UTC m=+147.740102804" watchObservedRunningTime="2026-02-17 13:27:53.629386301 +0000 UTC m=+147.740805638" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.649494 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjbls" podStartSLOduration=126.649478684 podStartE2EDuration="2m6.649478684s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.647521337 +0000 UTC m=+147.758940674" watchObservedRunningTime="2026-02-17 13:27:53.649478684 +0000 UTC m=+147.760898021" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.671588 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podStartSLOduration=126.671569644 podStartE2EDuration="2m6.671569644s" podCreationTimestamp="2026-02-17 13:25:47 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.668880293 +0000 UTC m=+147.780299630" watchObservedRunningTime="2026-02-17 13:27:53.671569644 +0000 UTC m=+147.782988981" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.686050 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podStartSLOduration=125.686031787 podStartE2EDuration="2m5.686031787s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.685103387 +0000 UTC m=+147.796522724" watchObservedRunningTime="2026-02-17 13:27:53.686031787 +0000 UTC m=+147.797451124" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.705935 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podStartSLOduration=126.705921783 podStartE2EDuration="2m6.705921783s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.70402247 +0000 UTC m=+147.815441807" watchObservedRunningTime="2026-02-17 13:27:53.705921783 +0000 UTC m=+147.817341120" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.709237 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.709625 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.209583516 +0000 UTC m=+148.321002913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.712815 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.716108 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.216099795 +0000 UTC m=+148.327519132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.725994 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-td8n5" podStartSLOduration=7.725977625 podStartE2EDuration="7.725977625s" podCreationTimestamp="2026-02-17 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.72523287 +0000 UTC m=+147.836652237" watchObservedRunningTime="2026-02-17 13:27:53.725977625 +0000 UTC m=+147.837396962" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.777165 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" podStartSLOduration=125.777142709 podStartE2EDuration="2m5.777142709s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.749240265 +0000 UTC m=+147.860659612" watchObservedRunningTime="2026-02-17 13:27:53.777142709 +0000 UTC m=+147.888562046" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.779742 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podStartSLOduration=125.779732716 podStartE2EDuration="2m5.779732716s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.776449446 +0000 UTC m=+147.887868783" watchObservedRunningTime="2026-02-17 13:27:53.779732716 +0000 UTC m=+147.891152053" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.795477 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q46rz" podStartSLOduration=125.795460923 podStartE2EDuration="2m5.795460923s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.793722825 +0000 UTC m=+147.905142162" watchObservedRunningTime="2026-02-17 13:27:53.795460923 +0000 UTC m=+147.906880260" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.810533 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-82b8f" podStartSLOduration=125.810495857 podStartE2EDuration="2m5.810495857s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.808361795 +0000 UTC m=+147.919781132" watchObservedRunningTime="2026-02-17 13:27:53.810495857 +0000 UTC m=+147.921915194" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.813933 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.814316 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.314301394 +0000 UTC m=+148.425720731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.827010 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podStartSLOduration=125.826992449 podStartE2EDuration="2m5.826992449s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:53.824391212 +0000 UTC m=+147.935810549" watchObservedRunningTime="2026-02-17 13:27:53.826992449 +0000 UTC m=+147.938411786" Feb 17 13:27:53 crc kubenswrapper[4804]: I0217 13:27:53.915287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:53 crc kubenswrapper[4804]: E0217 13:27:53.915751 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.415732251 +0000 UTC m=+148.527151648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.016739 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.016959 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.516925121 +0000 UTC m=+148.628344458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.018105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.018440 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.518429472 +0000 UTC m=+148.629848909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.070175 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:54 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:54 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:54 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.070256 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.118962 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.119186 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:54.619147525 +0000 UTC m=+148.730566872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.119328 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.119769 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.619757536 +0000 UTC m=+148.731176893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.205557 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206178 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206419 4804 patch_prober.go:28] interesting pod/apiserver-76f77b778f-46w22 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.206461 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podUID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.220683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 
13:27:54.220895 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.720878942 +0000 UTC m=+148.832298279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.221058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.221433 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.721423751 +0000 UTC m=+148.832843088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.323005 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.323069 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.823050945 +0000 UTC m=+148.934470282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.323382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.323650 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.823642905 +0000 UTC m=+148.935062242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.424777 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.424885 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.924863886 +0000 UTC m=+149.036283223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.425003 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.425361 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:54.925353331 +0000 UTC m=+149.036772668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.526123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.526512 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.026497799 +0000 UTC m=+149.137917136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.597826 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" event={"ID":"527ee9be-17be-4352-86fc-ef31bece3e86","Type":"ContainerStarted","Data":"0528d846578dc0dbefe2cbed846ca3fcbf47ebf9286a31e312c4ccdb2764afc8"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.600015 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6mxf" event={"ID":"c36c8731-9ee6-4ce6-8708-9e35e6112804","Type":"ContainerStarted","Data":"3edbca92b41e3bc0a5efddec609eaa025b38796ec47c30a71c9349c4082125a6"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.600161 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.601864 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" event={"ID":"a6882836-eb39-412c-a0d6-4906c9be9b89","Type":"ContainerStarted","Data":"0a375fd15e41373a403c853831490ec34e3e6b25fca0f36c84d69bc96f1e8ceb"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.604449 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" event={"ID":"bd4df830-6ec9-4f4d-860e-f97af3088371","Type":"ContainerStarted","Data":"4a7592febf5222b4c6dcee1635095b02462340091529fb375dca19c64fff324e"} Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 
13:27:54.605701 4804 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-645bx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.605735 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" podUID="fd233b99-2205-4e95-ba04-232015517afb" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.606971 4804 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vmmzq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607002 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607033 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607047 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607092 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607033 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" podUID="81a4453c-e1e8-4624-a19b-f08ec4df93d7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607114 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.607095 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.627948 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.628321 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.12830468 +0000 UTC m=+149.239724017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.646249 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bbjwp" podStartSLOduration=126.64622862 podStartE2EDuration="2m6.64622862s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.619601968 +0000 UTC m=+148.731021305" watchObservedRunningTime="2026-02-17 13:27:54.64622862 +0000 UTC m=+148.757647947" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.646552 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4gnh" podStartSLOduration=127.64654649 podStartE2EDuration="2m7.64654649s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.643274012 +0000 UTC m=+148.754693349" watchObservedRunningTime="2026-02-17 13:27:54.64654649 +0000 UTC m=+148.757965837" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.718660 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-trrpm" podStartSLOduration=126.718641596 podStartE2EDuration="2m6.718641596s" podCreationTimestamp="2026-02-17 13:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.676550996 +0000 UTC m=+148.787970333" watchObservedRunningTime="2026-02-17 13:27:54.718641596 +0000 UTC m=+148.830060933" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.720945 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d6mxf" podStartSLOduration=7.720933913 podStartE2EDuration="7.720933913s" podCreationTimestamp="2026-02-17 13:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:27:54.7167044 +0000 UTC m=+148.828123737" watchObservedRunningTime="2026-02-17 13:27:54.720933913 +0000 UTC m=+148.832353250" Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.728809 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.728994 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.228950241 +0000 UTC m=+149.340369578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.731380 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.735182 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.235166439 +0000 UTC m=+149.346585776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.833069 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.833321 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.333290816 +0000 UTC m=+149.444710153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.833517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.833897 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.333882965 +0000 UTC m=+149.445302302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.934188 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.934338 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.43431301 +0000 UTC m=+149.545732347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:54 crc kubenswrapper[4804]: I0217 13:27:54.934442 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:54 crc kubenswrapper[4804]: E0217 13:27:54.934727 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.434713633 +0000 UTC m=+149.546132960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.034956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.035116 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.535085535 +0000 UTC m=+149.646504872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.036522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.036821 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.536808883 +0000 UTC m=+149.648228220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.071795 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:55 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:55 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:55 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.071868 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.138115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.138339 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:55.638302243 +0000 UTC m=+149.749721920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.138401 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.138729 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.638713347 +0000 UTC m=+149.750132744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.239827 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.240258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.740227456 +0000 UTC m=+149.851646793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.342086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.342454 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.84243926 +0000 UTC m=+149.953858597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443695 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.443874 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.943839607 +0000 UTC m=+150.055258944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443926 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.443997 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.444016 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.444048 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.444552 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:55.9445449 +0000 UTC m=+150.055964237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.445044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.454947 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.462812 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.478572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.544758 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.544961 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.044931622 +0000 UTC m=+150.156350949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.545046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.545411 4804 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.045394688 +0000 UTC m=+150.156814025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.606270 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.620028 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.621971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"99e6fa30f665a16ca4fdadb770e80b0a0dce06c141f18d2af61b8e64abe50477"} Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.622988 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6k2g8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.623036 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.640143 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.640570 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vmmzq" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.648126 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.648537 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.148520942 +0000 UTC m=+150.259940279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.750642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.752264 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.252243906 +0000 UTC m=+150.363663273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.836738 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.836795 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.852241 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.852614 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.352580638 +0000 UTC m=+150.463999985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.852697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.853088 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.353076724 +0000 UTC m=+150.464496111 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.953446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.953653 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.453623012 +0000 UTC m=+150.565042349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:55 crc kubenswrapper[4804]: I0217 13:27:55.953780 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:55 crc kubenswrapper[4804]: E0217 13:27:55.954165 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.454149839 +0000 UTC m=+150.565569176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.057931 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.058258 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.558239546 +0000 UTC m=+150.669658883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.062934 4804 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-b8qc5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063301 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podUID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063660 4804 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-b8qc5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.063686 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" podUID="70a41b60-6ec1-491d-9d3e-88758d91c45e" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: 
connection refused" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.073578 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:56 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:56 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:56 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.073643 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.160155 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.160538 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.660520533 +0000 UTC m=+150.771939860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.260635 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.260905 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.760891464 +0000 UTC m=+150.872310801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.361873 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.362217 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.862190468 +0000 UTC m=+150.973609795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.462546 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.462741 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.962712595 +0000 UTC m=+151.074131932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.462989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.463328 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:56.963316824 +0000 UTC m=+151.074736161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.563991 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.564351 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.064336368 +0000 UTC m=+151.175755695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.624234 4804 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9d9jq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.624284 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" podUID="360a1093-b581-4806-9f88-3d3907bd4895" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.634102 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9773c5fa71245a921a7c993b56e54bfffe31e532ff0b9ea4ad398b93725f7e05"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.634169 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4773b0effe440a524c8382bb47c37b2839d736dfb4ea26c7ae3a826a894deedc"} Feb 17 13:27:56 crc 
kubenswrapper[4804]: I0217 13:27:56.634361 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.636533 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6db35b0bde03ead0a5ecb051839cfb7dd6a87126d40d29d3f474c8ca1b1c4cee"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.636567 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"66b1dd170c2216410643352bb5d78bf689d4dc2fc85b0d37802bf28f239c9a34"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.638033 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"114c65613ac1b1c3fd98ddff99d9e68e0fbcf7f285e30ea7df070f3b81b69753"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.638063 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"15e64c1706dd44afc63fd2182ca86597d8d37b981fbf914ec0d02c0fb33adc8e"} Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.665822 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.666159 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.166144419 +0000 UTC m=+151.277563756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.767345 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.767491 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.267473183 +0000 UTC m=+151.378892520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.767612 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.768171 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.268162316 +0000 UTC m=+151.379581643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.869055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.869244 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.36921894 +0000 UTC m=+151.480638277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.869270 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.869576 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.369568352 +0000 UTC m=+151.480987689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:56 crc kubenswrapper[4804]: I0217 13:27:56.970251 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:56 crc kubenswrapper[4804]: E0217 13:27:56.970644 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.470610807 +0000 UTC m=+151.582030144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.071367 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:57 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:57 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:57 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.071457 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.072122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.072477 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:57.572462159 +0000 UTC m=+151.683881556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.173061 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.173307 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.673270895 +0000 UTC m=+151.784690232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.173627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.173945 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.673937148 +0000 UTC m=+151.785356485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.275433 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.275647 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.775617963 +0000 UTC m=+151.887037300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.275802 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.276097 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.776085389 +0000 UTC m=+151.887504726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.342295 4804 patch_prober.go:28] interesting pod/console-f9d7485db-tz5vz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.342375 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.349271 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.349343 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.350585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.350689 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.353027 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.359146 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.360456 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.377389 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.377556 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.877523257 +0000 UTC m=+151.988942614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.377908 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.378232 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:57.87822136 +0000 UTC m=+151.989640747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478592 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478798 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.478941 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.479076 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:57.979047228 +0000 UTC m=+152.090466565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.580772 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.580949 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.08093226 +0000 UTC m=+152.192351597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.584913 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.586078 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.591039 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.601844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.624469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632288 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632351 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632373 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.632424 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.670474 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.681457 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.681753 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682034 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682062 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.682087 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.682234 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.182220203 +0000 UTC m=+152.293639540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783156 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc 
kubenswrapper[4804]: I0217 13:27:57.783623 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.783667 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.784869 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.785207 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.785444 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.28543312 +0000 UTC m=+152.396852457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.785823 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.786736 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.789775 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.832111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"community-operators-54w49\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.870614 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.886767 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:57 
crc kubenswrapper[4804]: I0217 13:27:57.887031 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.887073 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.887128 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.887280 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.387261211 +0000 UTC m=+152.498680548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.900562 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.987947 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988005 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988032 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988111 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: E0217 13:27:57.988438 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.488416559 +0000 UTC m=+152.599835906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988477 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.988517 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.989954 4804 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:57 crc kubenswrapper[4804]: I0217 13:27:57.990863 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.012317 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.030233 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.049309 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"certified-operators-hpw7w\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.075368 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:58 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:58 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:58 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.075424 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.089501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.091233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.091392 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.591363707 +0000 UTC m=+152.702783044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.108875 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.190673 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.193642 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.196971 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197142 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.197218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.197497 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.697486472 +0000 UTC m=+152.808905809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.201684 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.202024 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.223518 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.254581 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"community-operators-f9k56\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.299830 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300074 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300164 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.300182 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.300296 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.800280885 +0000 UTC m=+152.911700222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.324491 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.371995 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: 
\"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.402939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.403260 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.403682 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:58.903671998 +0000 UTC m=+153.015091335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.403980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.468547 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"certified-operators-dfpnq\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.511842 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.512347 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 13:27:59.012326368 +0000 UTC m=+153.123745715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.562577 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.613363 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.613869 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.113857408 +0000 UTC m=+153.225276735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.715532 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.715823 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.215809784 +0000 UTC m=+153.327229121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.744574 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.745976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"84be3e5e7a29f1e7c2df4e9c48178fc69447e44a3a7ff354079c9086c2b1423d"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.746011 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"fe7424468529c03e2fae2003698ddb34eed0cf212f15f0eadffad5da9e45a22a"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.747242 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"14bd0e0c6146aca8722f654770d91415f769ddfe462bd310b48fc23e91722dce"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.748553 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerStarted","Data":"0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58"} Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.817112 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.817462 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.317449548 +0000 UTC m=+153.428868885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.921294 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:58 crc kubenswrapper[4804]: E0217 13:27:58.922169 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.422143505 +0000 UTC m=+153.533562842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:58 crc kubenswrapper[4804]: I0217 13:27:58.983408 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.023174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: E0217 13:27:59.023592 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 13:27:59.523575582 +0000 UTC m=+153.634994929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggf6k" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.042501 4804 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.061955 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b8qc5" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.073382 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:27:59 crc kubenswrapper[4804]: healthz check failed Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.073432 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.083438 4804 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T13:27:59.042527877Z","Handler":null,"Name":""} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.094557 4804 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.094594 4804 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.099699 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.124588 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.156330 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.214298 4804 patch_prober.go:28] interesting pod/apiserver-76f77b778f-46w22 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]log ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]etcd ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/max-in-flight-filter ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 17 13:27:59 crc kubenswrapper[4804]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/project.openshift.io-projectcache ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/openshift.io-startinformers ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 17 13:27:59 crc kubenswrapper[4804]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 13:27:59 crc kubenswrapper[4804]: livez check failed Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.214353 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-46w22" podUID="3ea797e4-54e0-4063-8d2b-647f6686e2a8" containerName="openshift-apiserver" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.226668 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.321367 4804 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.321678 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.361273 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggf6k\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.554605 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mcszv" Feb 17 13:27:59 crc 
kubenswrapper[4804]: I0217 13:27:59.634080 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.761971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"f8fddc3c1f1b98532bbecd6c7da5c2a2368e8ed8a3bd8f6f7983638879bf50a9"} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.763564 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.764389 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerStarted","Data":"21bf4e05af6fa23bdde7a029ebf7c31d1a22cc2791c5a01af78f87549037e881"} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.770447 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.771583 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.774307 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.775934 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerStarted","Data":"7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2"} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.777854 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerStarted","Data":"be09cbde5111c6442fb7580667b29d0357b1495c50edff7352458e4b0ddab9db"} Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.788515 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838604 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838735 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.838926 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.842844 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939633 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.939744 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.940409 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"redhat-marketplace-fvtl6\" 
(UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.940629 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:27:59 crc kubenswrapper[4804]: I0217 13:27:59.962306 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"redhat-marketplace-fvtl6\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.010492 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.066954 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.071368 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:00 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:00 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:00 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.071429 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" 
podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.090322 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.100059 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.103950 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.167060 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.168346 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.177621 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245092 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245216 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.245243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347223 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347771 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.347969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.348131 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.348561 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.372862 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"redhat-marketplace-j44f8\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.455527 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.457629 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:28:00 crc kubenswrapper[4804]: W0217 13:28:00.457820 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a10f4e7_7906_43aa_98fb_e709a71a55d2.slice/crio-122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74 WatchSource:0}: Error finding container 122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74: Status 404 returned error can't find the container with id 122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.504087 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.538704 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-645bx" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.559924 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.564664 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.570516 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.570668 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.571100 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.583261 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.628347 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9d9jq" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.653654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.653764 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754602 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.754818 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.784628 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.785727 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.787058 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.788574 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.811780 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.828993 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerStarted","Data":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.829058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerStarted","Data":"4dd741b3c38a0505bebb7c99e18c919af01e075e7767edd7ca2356d4e858351e"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.829907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.831589 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.831642 
4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.833313 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.835964 4804 generic.go:334] "Generic (PLEG): container finished" podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.836027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.856943 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.857084 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.857139 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.877968 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.878103 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.888350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" event={"ID":"fd882ba0-9d9f-4a38-8a48-ab4d146fff56","Type":"ContainerStarted","Data":"447be96020e56044a9ec997c50432488c0c2f1e04113a47213b1169a9b9d44be"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.892654 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.893589 4804 generic.go:334] "Generic (PLEG): container finished" podID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerID="4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.893609 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerDied","Data":"4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.897134 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.897352 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.899232 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerStarted","Data":"122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.902298 4804 generic.go:334] "Generic (PLEG): container finished" podID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerID="7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2" exitCode=0 Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.902897 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerDied","Data":"7c78c7947559e8f76292ea42131dae6c0ad7eaf265131a245dfed7a7568f72f2"} Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.938381 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.938382 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" podStartSLOduration=133.93836647 podStartE2EDuration="2m13.93836647s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:00.921810705 +0000 UTC m=+155.033230052" watchObservedRunningTime="2026-02-17 13:28:00.93836647 +0000 UTC m=+155.049785807" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.958180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.959347 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.959972 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod 
\"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.960106 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.961167 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:00 crc kubenswrapper[4804]: I0217 13:28:00.980006 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"redhat-operators-xf58f\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.021687 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vwjsv" podStartSLOduration=15.02167073 podStartE2EDuration="15.02167073s" podCreationTimestamp="2026-02-17 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:01.019477726 +0000 UTC m=+155.130897063" watchObservedRunningTime="2026-02-17 13:28:01.02167073 +0000 UTC m=+155.133090057" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.070220 4804 patch_prober.go:28] interesting 
pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:01 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:01 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:01 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.070265 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.136689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 13:28:01 crc kubenswrapper[4804]: W0217 13:28:01.145510 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod725ad1d2_2625_4eeb_b16b_7bc5ecb54c23.slice/crio-e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92 WatchSource:0}: Error finding container e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92: Status 404 returned error can't find the container with id e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.167497 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.170299 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.178375 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.181258 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265459 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265642 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.265803 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367698 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367765 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.367853 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.368175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.368256 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.377422 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:28:01 crc kubenswrapper[4804]: W0217 13:28:01.384373 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbfd9db_3d17_44af_ab32_d2f7e7a1fab5.slice/crio-6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161 WatchSource:0}: Error finding container 6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161: Status 404 
returned error can't find the container with id 6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.388972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"redhat-operators-c4fxk\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.490559 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.722600 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912333 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613" exitCode=0 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912413 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.912867 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"7e1b2fb29927815e4957ff56f7ae370566373e378aef77389a1de5a8d2809eef"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914771 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" 
containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" exitCode=0 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.914928 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.921080 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" exitCode=0 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.921186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.924688 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b" exitCode=0 Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.924973 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.925019 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerStarted","Data":"c5910c70e84a82abe005c7000c40085a9ab0598685cbc3225b9df0cad35f66af"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.928217 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerStarted","Data":"2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe"} Feb 17 13:28:01 crc kubenswrapper[4804]: I0217 13:28:01.928251 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerStarted","Data":"e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.026364 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.026344592 podStartE2EDuration="2.026344592s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:02.006388723 +0000 UTC m=+156.117808050" watchObservedRunningTime="2026-02-17 13:28:02.026344592 +0000 UTC m=+156.137763929" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.072436 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:02 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:02 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:02 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 
13:28:02.072493 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.208552 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.212338 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283409 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") pod \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283611 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283691 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") pod \"3768c453-c58d-4768-9620-a202cbb8ccd8\" (UID: \"3768c453-c58d-4768-9620-a202cbb8ccd8\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.283776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") pod \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\" (UID: \"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf\") " Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.284265 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" (UID: "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.287469 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.292353 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" (UID: "9d3918ab-cfeb-4e36-82eb-349dd3cf74bf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.292391 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv" (OuterVolumeSpecName: "kube-api-access-7z9lv") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "kube-api-access-7z9lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.295517 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3768c453-c58d-4768-9620-a202cbb8ccd8" (UID: "3768c453-c58d-4768-9620-a202cbb8ccd8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387471 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387500 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3768c453-c58d-4768-9620-a202cbb8ccd8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387513 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z9lv\" (UniqueName: \"kubernetes.io/projected/3768c453-c58d-4768-9620-a202cbb8ccd8-kube-api-access-7z9lv\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387523 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3768c453-c58d-4768-9620-a202cbb8ccd8-config-volume\") 
on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.387534 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d3918ab-cfeb-4e36-82eb-349dd3cf74bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.937961 4804 generic.go:334] "Generic (PLEG): container finished" podID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerID="2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe" exitCode=0 Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.938075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerDied","Data":"2d3cec7a2f95695d7c010f1a7f6b64ed68e16ed941a591c50fa5d0451060f1fe"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942755 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9d3918ab-cfeb-4e36-82eb-349dd3cf74bf","Type":"ContainerDied","Data":"0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942784 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0268853808d2c1a2c3d8e2668996471a68d66a841ce6ccf78ec063e7971f0d58" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.942803 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.947974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" event={"ID":"3768c453-c58d-4768-9620-a202cbb8ccd8","Type":"ContainerDied","Data":"76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129"} Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.948041 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ca0c3a1f23c1bfd5829400e9cd39546fdd21f9b110e5b71897bf1278603129" Feb 17 13:28:02 crc kubenswrapper[4804]: I0217 13:28:02.948045 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8" Feb 17 13:28:03 crc kubenswrapper[4804]: I0217 13:28:03.070530 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:03 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:03 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:03 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:03 crc kubenswrapper[4804]: I0217 13:28:03.070643 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.069179 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Feb 17 13:28:04 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:04 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:04 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.069244 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.211440 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.217615 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-46w22" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.515777 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.628607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") pod \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.628755 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") pod \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\" (UID: \"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23\") " Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.629044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" (UID: "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.660367 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" (UID: "725ad1d2-2625-4eeb-b16b-7bc5ecb54c23"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.730924 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:04 crc kubenswrapper[4804]: I0217 13:28:04.730963 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/725ad1d2-2625-4eeb-b16b-7bc5ecb54c23-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012929 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"725ad1d2-2625-4eeb-b16b-7bc5ecb54c23","Type":"ContainerDied","Data":"e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92"} Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012977 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.012991 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e630bb8a35e2d53701229b25e5c5a9539cf7d3f6ca3f79ebd2da67b9da3f9f92" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.077502 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:05 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:05 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:05 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.083714 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:05 crc kubenswrapper[4804]: I0217 13:28:05.415092 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d6mxf" Feb 17 13:28:06 crc kubenswrapper[4804]: I0217 13:28:06.068885 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:06 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:06 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:06 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:06 crc kubenswrapper[4804]: I0217 13:28:06.068958 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" 
podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.069070 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:07 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:07 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:07 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.069183 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.339193 4804 patch_prober.go:28] interesting pod/console-f9d7485db-tz5vz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.339288 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638625 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:28:07 
crc kubenswrapper[4804]: I0217 13:28:07.638749 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638939 4804 patch_prober.go:28] interesting pod/downloads-7954f5f757-w4nl5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 17 13:28:07 crc kubenswrapper[4804]: I0217 13:28:07.638981 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-w4nl5" podUID="4c36b00a-bd3f-424c-a67b-d828d782e60f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 17 13:28:08 crc kubenswrapper[4804]: I0217 13:28:08.069393 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:08 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:08 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:08 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:08 crc kubenswrapper[4804]: I0217 13:28:08.069758 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.068923 4804 patch_prober.go:28] interesting 
pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:09 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:09 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:09 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.069333 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.925677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:09 crc kubenswrapper[4804]: I0217 13:28:09.946237 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e77722ba-d383-442c-b6dc-9983cf233257-metrics-certs\") pod \"network-metrics-daemon-4jfgm\" (UID: \"e77722ba-d383-442c-b6dc-9983cf233257\") " pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.033807 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jfgm" Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.069884 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:10 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:10 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:10 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:10 crc kubenswrapper[4804]: I0217 13:28:10.069955 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:11 crc kubenswrapper[4804]: I0217 13:28:11.069727 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:11 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:11 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:11 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:11 crc kubenswrapper[4804]: I0217 13:28:11.070005 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:12 crc kubenswrapper[4804]: I0217 13:28:12.069247 4804 patch_prober.go:28] interesting pod/router-default-5444994796-kbpk6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 17 13:28:12 crc kubenswrapper[4804]: [-]has-synced failed: reason withheld Feb 17 13:28:12 crc kubenswrapper[4804]: [+]process-running ok Feb 17 13:28:12 crc kubenswrapper[4804]: healthz check failed Feb 17 13:28:12 crc kubenswrapper[4804]: I0217 13:28:12.069376 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kbpk6" podUID="074c752f-fec1-4bd6-8773-596461ea288a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 13:28:13 crc kubenswrapper[4804]: I0217 13:28:13.074940 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:28:13 crc kubenswrapper[4804]: I0217 13:28:13.086025 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kbpk6" Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.439385 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.440036 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" containerID="cri-o://e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" gracePeriod=30 Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.445806 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:28:15 crc kubenswrapper[4804]: I0217 13:28:15.446077 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" 
podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" containerID="cri-o://d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301" gracePeriod=30 Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.119738 4804 generic.go:334] "Generic (PLEG): container finished" podID="1d929eaa-807c-4809-8b8a-78c186418e71" containerID="e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" exitCode=0 Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.119809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerDied","Data":"e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2"} Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.121118 4804 generic.go:334] "Generic (PLEG): container finished" podID="faba1ad1-aeda-412d-9824-36cc045bab86" containerID="d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301" exitCode=0 Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.121147 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerDied","Data":"d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301"} Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.444182 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.451039 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tz5vz" Feb 17 13:28:17 crc kubenswrapper[4804]: I0217 13:28:17.637132 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-w4nl5" Feb 17 13:28:19 crc kubenswrapper[4804]: I0217 
13:28:19.642047 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:28:20 crc kubenswrapper[4804]: I0217 13:28:20.086822 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 13:28:20 crc kubenswrapper[4804]: I0217 13:28:20.087154 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 13:28:21 crc kubenswrapper[4804]: I0217 13:28:21.003471 4804 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqkcq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: i/o timeout" start-of-body= Feb 17 13:28:21 crc kubenswrapper[4804]: I0217 13:28:21.003527 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: i/o timeout" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.766236 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.802629 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803014 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803048 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles" Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803083 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803100 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803123 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803136 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: E0217 13:28:24.803158 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803173 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803442 4804 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" containerName="route-controller-manager" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803477 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" containerName="collect-profiles" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803505 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="725ad1d2-2625-4eeb-b16b-7bc5ecb54c23" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.803527 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d3918ab-cfeb-4e36-82eb-349dd3cf74bf" containerName="pruner" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.804160 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.824884 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847773 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847838 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847921 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.847982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") pod \"faba1ad1-aeda-412d-9824-36cc045bab86\" (UID: \"faba1ad1-aeda-412d-9824-36cc045bab86\") " Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848137 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848231 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848247 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848277 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848836 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config" (OuterVolumeSpecName: "config") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.848912 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca" (OuterVolumeSpecName: "client-ca") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.860286 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.861639 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b" (OuterVolumeSpecName: "kube-api-access-2wz5b") pod "faba1ad1-aeda-412d-9824-36cc045bab86" (UID: "faba1ad1-aeda-412d-9824-36cc045bab86"). 
InnerVolumeSpecName "kube-api-access-2wz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950154 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950221 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950375 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950397 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faba1ad1-aeda-412d-9824-36cc045bab86-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950415 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faba1ad1-aeda-412d-9824-36cc045bab86-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.950432 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wz5b\" (UniqueName: \"kubernetes.io/projected/faba1ad1-aeda-412d-9824-36cc045bab86-kube-api-access-2wz5b\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.952261 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.952554 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.955873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod 
\"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:24 crc kubenswrapper[4804]: I0217 13:28:24.974840 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"route-controller-manager-6bb46c8d9c-hxxp8\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.132362 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.176925 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" event={"ID":"faba1ad1-aeda-412d-9824-36cc045bab86","Type":"ContainerDied","Data":"a4b6cbfefaf077ffe0f3e71671fde2907fe889b88fd4a0d27ee5e5b910c2832f"} Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.176994 4804 scope.go:117] "RemoveContainer" containerID="d31353ab3fe48fbdb124d235032b5df5328407038b987a4788426c871ad7a301" Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.177050 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq" Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.223509 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.228135 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqkcq"] Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.835360 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:28:25 crc kubenswrapper[4804]: I0217 13:28:25.835436 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:28:26 crc kubenswrapper[4804]: I0217 13:28:26.581967 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faba1ad1-aeda-412d-9824-36cc045bab86" path="/var/lib/kubelet/pods/faba1ad1-aeda-412d-9824-36cc045bab86/volumes" Feb 17 13:28:30 crc kubenswrapper[4804]: I0217 13:28:30.258749 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6xkvs" Feb 17 13:28:31 crc kubenswrapper[4804]: I0217 13:28:31.086508 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:28:31 crc kubenswrapper[4804]: I0217 13:28:31.086564 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.365112 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.366797 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.367783 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.371099 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.377592 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.500331 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.500461 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.544541 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602228 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.602334 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.610703 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jfgm"] Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.626090 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:35 crc kubenswrapper[4804]: I0217 13:28:35.692616 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:28:36 crc kubenswrapper[4804]: I0217 13:28:36.155911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.301689 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.303006 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6cwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hpw7w_openshift-marketplace(cbda9f29-b199-4a42-8757-f5ecc90f0437): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:40 crc kubenswrapper[4804]: E0217 13:28:40.304413 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" Feb 17 13:28:40 crc 
kubenswrapper[4804]: I0217 13:28:40.946051 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 13:28:40 crc kubenswrapper[4804]: I0217 13:28:40.946878 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:40 crc kubenswrapper[4804]: I0217 13:28:40.958006 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.071864 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.072221 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.072272 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.086661 4804 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j8ggj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.086799 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173443 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.173579 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:41 crc kubenswrapper[4804]: I0217 13:28:41.193308 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:42 crc kubenswrapper[4804]: I0217 13:28:42.129569 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.193604 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.274354 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.274545 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hh9rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dfpnq_openshift-marketplace(af8f355f-84e5-49b0-83f4-b87ce7bb4015): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.276568 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" 
Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.481559 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.481712 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmjk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-c4fxk_openshift-marketplace(3d715b9f-61c8-4851-a4b1-452f9f3ea8bd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:42 crc kubenswrapper[4804]: E0217 13:28:42.483053 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.436620 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.437094 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nf9xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-f9k56_openshift-marketplace(dd3f4542-6055-4524-9e05-58b4c9a16e37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:44 crc kubenswrapper[4804]: E0217 13:28:44.438289 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" Feb 17 13:28:45 crc 
kubenswrapper[4804]: E0217 13:28:45.286138 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 13:28:45 crc kubenswrapper[4804]: E0217 13:28:45.286345 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pm9gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-xf58f_openshift-marketplace(4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:45 crc kubenswrapper[4804]: E0217 13:28:45.288612 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041239 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041738 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041823 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.041901 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.097716 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130043 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:48 crc kubenswrapper[4804]: E0217 13:28:48.130278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130289 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130385 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" containerName="controller-manager" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.130759 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.139714 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.166493 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"828853758eab48da037c771cf13c8e4fb60cb60ab76a545908fd820fdf6be8a4"} Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.168266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" event={"ID":"1d929eaa-807c-4809-8b8a-78c186418e71","Type":"ContainerDied","Data":"c622d293f5967334c96859bbffeac805786523250407581ac4cdc458a4cd4b45"} Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.168364 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j8ggj" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215011 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215084 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215156 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.215246 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") pod \"1d929eaa-807c-4809-8b8a-78c186418e71\" (UID: \"1d929eaa-807c-4809-8b8a-78c186418e71\") " Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216144 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.216310 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config" (OuterVolumeSpecName: "config") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.221782 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.222187 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554" (OuterVolumeSpecName: "kube-api-access-8x554") pod "1d929eaa-807c-4809-8b8a-78c186418e71" (UID: "1d929eaa-807c-4809-8b8a-78c186418e71"). InnerVolumeSpecName "kube-api-access-8x554". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.316973 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.317853 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.317929 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318108 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318297 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318315 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318330 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d929eaa-807c-4809-8b8a-78c186418e71-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318346 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x554\" (UniqueName: \"kubernetes.io/projected/1d929eaa-807c-4809-8b8a-78c186418e71-kube-api-access-8x554\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.318359 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d929eaa-807c-4809-8b8a-78c186418e71-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419256 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 
13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419326 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419413 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.419543 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.510297 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.515274 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-j8ggj"] Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.581587 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d929eaa-807c-4809-8b8a-78c186418e71" path="/var/lib/kubelet/pods/1d929eaa-807c-4809-8b8a-78c186418e71/volumes" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.610785 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.610825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.611644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.613741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 
13:28:48.614937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"controller-manager-66f58dbd5-dlsdn\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:48 crc kubenswrapper[4804]: I0217 13:28:48.910189 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.022382 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.022589 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdxzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fvtl6_openshift-marketplace(6a10f4e7-7906-43aa-98fb-e709a71a55d2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.023874 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" Feb 17 13:28:49 crc 
kubenswrapper[4804]: I0217 13:28:49.050491 4804 scope.go:117] "RemoveContainer" containerID="e85184210391718c97e4b64df6d5ddb787255a643b112757a5bd89ac1f1c1ad2" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.217128 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.299447 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.515749 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.529217 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9631847b_1aa3_4bbd_95d4_cee45d896b11.slice/crio-2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5 WatchSource:0}: Error finding container 2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5: Status 404 returned error can't find the container with id 2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5 Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.568446 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.569568 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c7ffc91_beb4_48c9_bd6a_3432eb40cb18.slice/crio-c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1 WatchSource:0}: Error finding container 
c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1: Status 404 returned error can't find the container with id c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1 Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.638382 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.638895 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsj25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j44f8_openshift-marketplace(4627be0e-b7ba-4e46-820b-0ce1271ecacb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 13:28:49 crc kubenswrapper[4804]: E0217 13:28:49.640037 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" Feb 17 13:28:49 crc kubenswrapper[4804]: I0217 13:28:49.671600 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 13:28:49 crc kubenswrapper[4804]: W0217 13:28:49.691155 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9dda4da8_c5ea_4c8a_8443_d7e31eba95af.slice/crio-e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1 WatchSource:0}: Error finding container e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1: Status 404 returned error can't find the container with id e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1 Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.185634 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerStarted","Data":"e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.188890 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" 
containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" exitCode=0 Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.189009 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.191764 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"cf47b71406f4bbe5bd193f96862e508e5e5e5461f1ee8dd7f13c9d10769af71a"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.196728 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerStarted","Data":"c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.202490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerStarted","Data":"2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerStarted","Data":"a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204498 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" 
event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerStarted","Data":"11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09"} Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.204619 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" containerID="cri-o://a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f" gracePeriod=30 Feb 17 13:28:50 crc kubenswrapper[4804]: E0217 13:28:50.207020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" Feb 17 13:28:50 crc kubenswrapper[4804]: I0217 13:28:50.240498 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podStartSLOduration=35.240475917 podStartE2EDuration="35.240475917s" podCreationTimestamp="2026-02-17 13:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:50.238131759 +0000 UTC m=+204.349551106" watchObservedRunningTime="2026-02-17 13:28:50.240475917 +0000 UTC m=+204.351895254" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.213080 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerStarted","Data":"6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3"} Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.216029 4804 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/network-metrics-daemon-4jfgm" event={"ID":"e77722ba-d383-442c-b6dc-9983cf233257","Type":"ContainerStarted","Data":"ac736acdb9f4adaaf3c6fe0c81a3865edef47271655b25ba995d0941cd6f23c6"} Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.220026 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerStarted","Data":"c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689"} Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.221801 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerStarted","Data":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"} Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.222152 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.225424 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=16.22540711 podStartE2EDuration="16.22540711s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.224440518 +0000 UTC m=+205.335859855" watchObservedRunningTime="2026-02-17 13:28:51.22540711 +0000 UTC m=+205.336826447" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229033 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229091 4804 generic.go:334] 
"Generic (PLEG): container finished" podID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerID="a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f" exitCode=255 Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.229123 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerDied","Data":"a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f"} Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.230881 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.242469 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4jfgm" podStartSLOduration=184.242447388 podStartE2EDuration="3m4.242447388s" podCreationTimestamp="2026-02-17 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.240019477 +0000 UTC m=+205.351438824" watchObservedRunningTime="2026-02-17 13:28:51.242447388 +0000 UTC m=+205.353866725" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.257274 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=11.257255192 podStartE2EDuration="11.257255192s" podCreationTimestamp="2026-02-17 13:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.255639149 +0000 UTC m=+205.367058486" watchObservedRunningTime="2026-02-17 13:28:51.257255192 +0000 UTC m=+205.368675069" Feb 17 13:28:51 crc kubenswrapper[4804]: I0217 13:28:51.272486 4804 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" podStartSLOduration=16.27247132 podStartE2EDuration="16.27247132s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:51.270936669 +0000 UTC m=+205.382356016" watchObservedRunningTime="2026-02-17 13:28:51.27247132 +0000 UTC m=+205.383890657" Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.141691 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.835870 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.835986 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.836066 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.837061 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:28:55 crc kubenswrapper[4804]: I0217 13:28:55.837248 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b" gracePeriod=600 Feb 17 13:28:56 crc kubenswrapper[4804]: I0217 13:28:56.141865 4804 patch_prober.go:28] interesting pod/route-controller-manager-6bb46c8d9c-hxxp8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 13:28:56 crc kubenswrapper[4804]: I0217 13:28:56.142455 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.270998 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.271056 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" 
event={"ID":"2cc5d152-9369-4574-ab6b-05d9d4c5afd7","Type":"ContainerDied","Data":"11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09"} Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.271085 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a6eeb787a54a0159bc228994e888bc7b3352ae3c3c245dcc87e80f7f925b09" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.311269 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6bb46c8d9c-hxxp8_2cc5d152-9369-4574-ab6b-05d9d4c5afd7/route-controller-manager/0.log" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.311344 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.378956 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:28:57 crc kubenswrapper[4804]: E0217 13:28:57.379243 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379386 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" containerName="route-controller-manager" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.379806 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.388944 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.471928 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472285 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472442 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472559 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") pod \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\" (UID: \"2cc5d152-9369-4574-ab6b-05d9d4c5afd7\") " Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod 
\"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472869 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.472995 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config" (OuterVolumeSpecName: "config") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473025 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473196 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473303 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.473320 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.477408 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp" (OuterVolumeSpecName: "kube-api-access-l74sp") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "kube-api-access-l74sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.478348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cc5d152-9369-4574-ab6b-05d9d4c5afd7" (UID: "2cc5d152-9369-4574-ab6b-05d9d4c5afd7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574240 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574312 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574350 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574543 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.574945 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74sp\" (UniqueName: \"kubernetes.io/projected/2cc5d152-9369-4574-ab6b-05d9d4c5afd7-kube-api-access-l74sp\") on node \"crc\" DevicePath \"\"" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.575490 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.576567 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.579439 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.591411 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"route-controller-manager-f687946cc-tvs6k\" (UID: 
\"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:57 crc kubenswrapper[4804]: I0217 13:28:57.700768 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.277249 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8" Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.314556 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.317540 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb46c8d9c-hxxp8"] Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.474544 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:28:58 crc kubenswrapper[4804]: I0217 13:28:58.582603 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc5d152-9369-4574-ab6b-05d9d4c5afd7" path="/var/lib/kubelet/pods/2cc5d152-9369-4574-ab6b-05d9d4c5afd7/volumes" Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298348 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b" exitCode=0 Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298880 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.298911 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.301869 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.304936 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerStarted","Data":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.304990 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerStarted","Data":"558d5dd2eecf846742fd5b4dd243c32953c0fb248ec2faa9cde568927170e4d7"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.305790 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.307632 4804 generic.go:334] "Generic (PLEG): container finished" podID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerID="6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3" exitCode=0 Feb 17 
13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.307688 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerDied","Data":"6155d5f4b6c5d243b45066428b06822d531e0daedd2837b5de7761b60473e5c3"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.309105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerStarted","Data":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"} Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.315260 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.366057 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54w49" podStartSLOduration=4.762243784 podStartE2EDuration="1m2.366041176s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.907756174 +0000 UTC m=+155.019175511" lastFinishedPulling="2026-02-17 13:28:58.511553566 +0000 UTC m=+212.622972903" observedRunningTime="2026-02-17 13:28:59.363975367 +0000 UTC m=+213.475394704" watchObservedRunningTime="2026-02-17 13:28:59.366041176 +0000 UTC m=+213.477460513" Feb 17 13:28:59 crc kubenswrapper[4804]: I0217 13:28:59.387043 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" podStartSLOduration=24.387022166 podStartE2EDuration="24.387022166s" podCreationTimestamp="2026-02-17 13:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:28:59.381490801 +0000 UTC 
m=+213.492910148" watchObservedRunningTime="2026-02-17 13:28:59.387022166 +0000 UTC m=+213.498441503" Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.315611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"} Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.320289 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19" exitCode=0 Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.320487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"} Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.631530 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.826715 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") pod \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.827133 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") pod \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\" (UID: \"9dda4da8-c5ea-4c8a-8443-d7e31eba95af\") " Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.827458 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9dda4da8-c5ea-4c8a-8443-d7e31eba95af" (UID: "9dda4da8-c5ea-4c8a-8443-d7e31eba95af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.832488 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9dda4da8-c5ea-4c8a-8443-d7e31eba95af" (UID: "9dda4da8-c5ea-4c8a-8443-d7e31eba95af"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.929431 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:00 crc kubenswrapper[4804]: I0217 13:29:00.929493 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dda4da8-c5ea-4c8a-8443-d7e31eba95af-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.326858 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.327019 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9dda4da8-c5ea-4c8a-8443-d7e31eba95af","Type":"ContainerDied","Data":"e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1"} Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.327061 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e438a86f0fff2efb5b7b2bf2b31c131dd07258affb80be6cb3518cd31c2549e1" Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.328759 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" exitCode=0 Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.328804 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"} Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.332098 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc"} Feb 17 13:29:01 crc kubenswrapper[4804]: I0217 13:29:01.332002 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc" exitCode=0 Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.353113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerStarted","Data":"32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b"} Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.357105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerStarted","Data":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"} Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.361723 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerStarted","Data":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"} Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.374815 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9k56" podStartSLOduration=4.169809769 podStartE2EDuration="1m5.374796065s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.833023831 +0000 UTC m=+154.944443168" lastFinishedPulling="2026-02-17 13:29:02.038010127 +0000 UTC m=+216.149429464" observedRunningTime="2026-02-17 13:29:02.374190334 +0000 UTC m=+216.485609681" 
watchObservedRunningTime="2026-02-17 13:29:02.374796065 +0000 UTC m=+216.486215402" Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.398162 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpw7w" podStartSLOduration=4.843019966 podStartE2EDuration="1m5.398146424s" podCreationTimestamp="2026-02-17 13:27:57 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.880369227 +0000 UTC m=+154.991788564" lastFinishedPulling="2026-02-17 13:29:01.435495685 +0000 UTC m=+215.546915022" observedRunningTime="2026-02-17 13:29:02.397362858 +0000 UTC m=+216.508782195" watchObservedRunningTime="2026-02-17 13:29:02.398146424 +0000 UTC m=+216.509565761" Feb 17 13:29:02 crc kubenswrapper[4804]: I0217 13:29:02.420684 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xf58f" podStartSLOduration=2.198926656 podStartE2EDuration="1m2.420664535s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.91673789 +0000 UTC m=+156.028157217" lastFinishedPulling="2026-02-17 13:29:02.138475759 +0000 UTC m=+216.249895096" observedRunningTime="2026-02-17 13:29:02.419825327 +0000 UTC m=+216.531244664" watchObservedRunningTime="2026-02-17 13:29:02.420664535 +0000 UTC m=+216.532083872" Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.371067 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa" exitCode=0 Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.371159 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa"} Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.376492 4804 
generic.go:334] "Generic (PLEG): container finished" podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb" exitCode=0 Feb 17 13:29:03 crc kubenswrapper[4804]: I0217 13:29:03.376531 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb"} Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.383780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerStarted","Data":"f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f"} Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.386075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"} Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.388921 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerStarted","Data":"503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f"} Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.437745 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dfpnq" podStartSLOduration=3.435925797 podStartE2EDuration="1m6.437726475s" podCreationTimestamp="2026-02-17 13:27:58 +0000 UTC" firstStartedPulling="2026-02-17 13:28:00.83746957 +0000 UTC m=+154.948888907" lastFinishedPulling="2026-02-17 13:29:03.839270248 +0000 UTC m=+217.950689585" observedRunningTime="2026-02-17 13:29:04.436011218 
+0000 UTC m=+218.547430565" watchObservedRunningTime="2026-02-17 13:29:04.437726475 +0000 UTC m=+218.549145812" Feb 17 13:29:04 crc kubenswrapper[4804]: I0217 13:29:04.438870 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j44f8" podStartSLOduration=2.3553082659999998 podStartE2EDuration="1m4.438863324s" podCreationTimestamp="2026-02-17 13:28:00 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.926532839 +0000 UTC m=+156.037952176" lastFinishedPulling="2026-02-17 13:29:04.010087897 +0000 UTC m=+218.121507234" observedRunningTime="2026-02-17 13:29:04.408581083 +0000 UTC m=+218.520000420" watchObservedRunningTime="2026-02-17 13:29:04.438863324 +0000 UTC m=+218.550282661" Feb 17 13:29:05 crc kubenswrapper[4804]: I0217 13:29:05.396695 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef" exitCode=0 Feb 17 13:29:05 crc kubenswrapper[4804]: I0217 13:29:05.396762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.405617 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" exitCode=0 Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.406254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.410687 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerStarted","Data":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"} Feb 17 13:29:06 crc kubenswrapper[4804]: I0217 13:29:06.441451 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4fxk" podStartSLOduration=1.5327347630000001 podStartE2EDuration="1m5.44143441s" podCreationTimestamp="2026-02-17 13:28:01 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.914457944 +0000 UTC m=+156.025877281" lastFinishedPulling="2026-02-17 13:29:05.823157591 +0000 UTC m=+219.934576928" observedRunningTime="2026-02-17 13:29:06.440472498 +0000 UTC m=+220.551891845" watchObservedRunningTime="2026-02-17 13:29:06.44143441 +0000 UTC m=+220.552853747" Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.419853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerStarted","Data":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.441300 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fvtl6" podStartSLOduration=3.316118946 podStartE2EDuration="1m8.44128093s" podCreationTimestamp="2026-02-17 13:27:59 +0000 UTC" firstStartedPulling="2026-02-17 13:28:01.923078823 +0000 UTC m=+156.034498160" lastFinishedPulling="2026-02-17 13:29:07.048240807 +0000 UTC m=+221.159660144" observedRunningTime="2026-02-17 13:29:07.438623622 +0000 UTC m=+221.550042969" watchObservedRunningTime="2026-02-17 13:29:07.44128093 +0000 UTC m=+221.552700267" Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.901243 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54w49" 
Feb 17 13:29:07 crc kubenswrapper[4804]: I0217 13:29:07.901312 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.110614 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9"] Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.111159 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.111289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.132838 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.180635 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.325931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.325985 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.384305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.469947 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.470942 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.478359 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.563731 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.563885 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:08 crc kubenswrapper[4804]: I0217 13:29:08.611590 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:09 crc kubenswrapper[4804]: I0217 13:29:09.468222 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.104654 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.104701 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.160020 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.504316 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.504388 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 
13:29:10 crc kubenswrapper[4804]: I0217 13:29:10.551785 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.180301 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.180572 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.234404 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.481945 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.482741 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.491676 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.491737 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.610012 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:11 crc kubenswrapper[4804]: I0217 13:29:11.610296 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9k56" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server" 
containerID="cri-o://32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" gracePeriod=2 Feb 17 13:29:12 crc kubenswrapper[4804]: I0217 13:29:12.542172 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" probeResult="failure" output=< Feb 17 13:29:12 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:29:12 crc kubenswrapper[4804]: > Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.011685 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.011945 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dfpnq" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server" containerID="cri-o://503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" gracePeriod=2 Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.451382 4804 generic.go:334] "Generic (PLEG): container finished" podID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerID="32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" exitCode=0 Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.451456 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b"} Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.623773 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797555 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.797676 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") pod \"dd3f4542-6055-4524-9e05-58b4c9a16e37\" (UID: \"dd3f4542-6055-4524-9e05-58b4c9a16e37\") " Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.798661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities" (OuterVolumeSpecName: "utilities") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.806664 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm" (OuterVolumeSpecName: "kube-api-access-nf9xm") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "kube-api-access-nf9xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.858557 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd3f4542-6055-4524-9e05-58b4c9a16e37" (UID: "dd3f4542-6055-4524-9e05-58b4c9a16e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.899882 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf9xm\" (UniqueName: \"kubernetes.io/projected/dd3f4542-6055-4524-9e05-58b4c9a16e37-kube-api-access-nf9xm\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.899930 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:13 crc kubenswrapper[4804]: I0217 13:29:13.900022 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3f4542-6055-4524-9e05-58b4c9a16e37-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.009420 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.009709 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j44f8" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server" containerID="cri-o://f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f" gracePeriod=2 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.462053 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerID="503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" exitCode=0 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.462138 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465072 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9k56" event={"ID":"dd3f4542-6055-4524-9e05-58b4c9a16e37","Type":"ContainerDied","Data":"21bf4e05af6fa23bdde7a029ebf7c31d1a22cc2791c5a01af78f87549037e881"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465115 4804 scope.go:117] "RemoveContainer" containerID="32268ca4bac2c1ba5e24c19291b99aeb8559e595975744c5d7c0ae06ab41c88b" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.465343 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9k56" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.471134 4804 generic.go:334] "Generic (PLEG): container finished" podID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerID="f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f" exitCode=0 Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.471178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f"} Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.503802 4804 scope.go:117] "RemoveContainer" containerID="01f4c85dd4ed77fe0ae4fd3853fe066d8a2f72e40a5062bcaf2b92496b6c83fc" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.506530 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.509092 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9k56"] Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.525953 4804 scope.go:117] "RemoveContainer" containerID="9da518d6a4ba94c30fc4e543aae3a6e806450f9d2bafc8157ce03ab22879d7ef" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.579632 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" path="/var/lib/kubelet/pods/dd3f4542-6055-4524-9e05-58b4c9a16e37/volumes" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.692169 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810547 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.810594 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") pod \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\" (UID: \"af8f355f-84e5-49b0-83f4-b87ce7bb4015\") " Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.812238 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities" (OuterVolumeSpecName: "utilities") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.817702 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr" (OuterVolumeSpecName: "kube-api-access-hh9rr") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "kube-api-access-hh9rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.904941 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af8f355f-84e5-49b0-83f4-b87ce7bb4015" (UID: "af8f355f-84e5-49b0-83f4-b87ce7bb4015"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911488 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9rr\" (UniqueName: \"kubernetes.io/projected/af8f355f-84e5-49b0-83f4-b87ce7bb4015-kube-api-access-hh9rr\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911568 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:14 crc kubenswrapper[4804]: I0217 13:29:14.911581 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8f355f-84e5-49b0-83f4-b87ce7bb4015-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.265422 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.417905 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.417996 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.418115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") pod \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\" (UID: \"4627be0e-b7ba-4e46-820b-0ce1271ecacb\") " Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.419277 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities" (OuterVolumeSpecName: "utilities") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.422157 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25" (OuterVolumeSpecName: "kube-api-access-rsj25") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "kube-api-access-rsj25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.448711 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4627be0e-b7ba-4e46-820b-0ce1271ecacb" (UID: "4627be0e-b7ba-4e46-820b-0ce1271ecacb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478499 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfpnq" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478488 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfpnq" event={"ID":"af8f355f-84e5-49b0-83f4-b87ce7bb4015","Type":"ContainerDied","Data":"be09cbde5111c6442fb7580667b29d0357b1495c50edff7352458e4b0ddab9db"} Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.478833 4804 scope.go:117] "RemoveContainer" containerID="503926379bfc61a672b44215088d72cfe3108d43867dcdd3e3945371b4cab72f" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.486441 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j44f8" event={"ID":"4627be0e-b7ba-4e46-820b-0ce1271ecacb","Type":"ContainerDied","Data":"c5910c70e84a82abe005c7000c40085a9ab0598685cbc3225b9df0cad35f66af"} Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.486476 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j44f8" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.512750 4804 scope.go:117] "RemoveContainer" containerID="6b673b46083a7a7e870939da823bebf898513e413a5e11d451d621999b90a4eb" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.512906 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.514875 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dfpnq"] Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521336 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsj25\" (UniqueName: \"kubernetes.io/projected/4627be0e-b7ba-4e46-820b-0ce1271ecacb-kube-api-access-rsj25\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521365 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.521374 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4627be0e-b7ba-4e46-820b-0ce1271ecacb-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.523182 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.528241 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j44f8"] Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.549839 4804 scope.go:117] "RemoveContainer" containerID="0a5fa9448a9b147d71180506aad70bb2187e4381cb523e0918b556f39008479f" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.570383 
4804 scope.go:117] "RemoveContainer" containerID="f8d7f73baa1032b6de41da56ddc6f1f2dec8f46b8ff8b6b1cc83c93dff54365f" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.594843 4804 scope.go:117] "RemoveContainer" containerID="a14465d915fa294528de1e1a532d12f42a2b05c614c04dfaa5801608931bc3fa" Feb 17 13:29:15 crc kubenswrapper[4804]: I0217 13:29:15.610966 4804 scope.go:117] "RemoveContainer" containerID="fd63f395d9d2acc2a5229430110a217a86178b2333399d07e264a3b4cbc4fc4b" Feb 17 13:29:16 crc kubenswrapper[4804]: I0217 13:29:16.580226 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" path="/var/lib/kubelet/pods/4627be0e-b7ba-4e46-820b-0ce1271ecacb/volumes" Feb 17 13:29:16 crc kubenswrapper[4804]: I0217 13:29:16.581121 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" path="/var/lib/kubelet/pods/af8f355f-84e5-49b0-83f4-b87ce7bb4015/volumes" Feb 17 13:29:20 crc kubenswrapper[4804]: I0217 13:29:20.157636 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:29:21 crc kubenswrapper[4804]: I0217 13:29:21.534276 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:21 crc kubenswrapper[4804]: I0217 13:29:21.572272 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:22 crc kubenswrapper[4804]: I0217 13:29:22.410662 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:29:23 crc kubenswrapper[4804]: I0217 13:29:23.523924 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4fxk" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" 
containerID="cri-o://79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" gracePeriod=2 Feb 17 13:29:23 crc kubenswrapper[4804]: I0217 13:29:23.884386 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.042833 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043160 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043258 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") pod \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\" (UID: \"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd\") " Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.043815 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities" (OuterVolumeSpecName: "utilities") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.049217 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4" (OuterVolumeSpecName: "kube-api-access-jmjk4") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "kube-api-access-jmjk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.144810 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmjk4\" (UniqueName: \"kubernetes.io/projected/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-kube-api-access-jmjk4\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.144847 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.169234 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" (UID: "3d715b9f-61c8-4851-a4b1-452f9f3ea8bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.245921 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533289 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" exitCode=0 Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533332 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"} Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533351 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4fxk" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533370 4804 scope.go:117] "RemoveContainer" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.533358 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4fxk" event={"ID":"3d715b9f-61c8-4851-a4b1-452f9f3ea8bd","Type":"ContainerDied","Data":"7e1b2fb29927815e4957ff56f7ae370566373e378aef77389a1de5a8d2809eef"} Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.550693 4804 scope.go:117] "RemoveContainer" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.564400 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.567385 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4fxk"] Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.580383 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" path="/var/lib/kubelet/pods/3d715b9f-61c8-4851-a4b1-452f9f3ea8bd/volumes" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.586389 4804 scope.go:117] "RemoveContainer" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601313 4804 scope.go:117] "RemoveContainer" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.601745 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": container with ID starting with 
79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f not found: ID does not exist" containerID="79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601789 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f"} err="failed to get container status \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": rpc error: code = NotFound desc = could not find container \"79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f\": container with ID starting with 79e41b90a52947db0ebb63d4e1e2052586e93f846d40771fb00d133db631541f not found: ID does not exist" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.601821 4804 scope.go:117] "RemoveContainer" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef" Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.602208 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": container with ID starting with e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef not found: ID does not exist" containerID="e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602238 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef"} err="failed to get container status \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": rpc error: code = NotFound desc = could not find container \"e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef\": container with ID starting with e1b53cd296296f2669228d6a1779fe692f64d2529d15b732912d752b28a4aeef not found: ID does not 
exist" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602253 4804 scope.go:117] "RemoveContainer" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613" Feb 17 13:29:24 crc kubenswrapper[4804]: E0217 13:29:24.602485 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": container with ID starting with 0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613 not found: ID does not exist" containerID="0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613" Feb 17 13:29:24 crc kubenswrapper[4804]: I0217 13:29:24.602509 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613"} err="failed to get container status \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": rpc error: code = NotFound desc = could not find container \"0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613\": container with ID starting with 0a2e7e7738570ae3eaeab55480ad74079b7d1d3eba43c78c570d3a883e196613 not found: ID does not exist" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.883532 4804 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884061 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884081 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884101 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" 
containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884109 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884120 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884127 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884144 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884151 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884157 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884166 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884172 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" 
containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884185 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884238 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884247 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884260 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884267 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884279 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884287 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="extract-utilities" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884297 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884305 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884313 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" 
containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884320 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="extract-content" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.884331 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884338 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884435 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3f4542-6055-4524-9e05-58b4c9a16e37" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884452 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8f355f-84e5-49b0-83f4-b87ce7bb4015" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884461 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d715b9f-61c8-4851-a4b1-452f9f3ea8bd" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884470 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4627be0e-b7ba-4e46-820b-0ce1271ecacb" containerName="registry-server" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884477 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dda4da8-c5ea-4c8a-8443-d7e31eba95af" containerName="pruner" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884760 4804 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.884996 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" gracePeriod=15 Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885150 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885390 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" gracePeriod=15 Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885561 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" gracePeriod=15 Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885603 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" gracePeriod=15 Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.885670 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" gracePeriod=15 Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887456 4804 kubelet.go:2421] "SyncLoop 
ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887593 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887607 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887616 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887623 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887632 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887638 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887651 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887656 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887668 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887673 4804 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887682 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887688 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887772 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887782 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887790 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887797 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887806 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 13:29:27 crc kubenswrapper[4804]: E0217 13:29:27.887893 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887900 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.887982 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.896948 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897008 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897135 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.897175 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:27 crc kubenswrapper[4804]: I0217 13:29:27.921589 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000756 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000778 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000754 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.000990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001051 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001069 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.001154 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 
13:29:28.217421 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:29:28 crc kubenswrapper[4804]: W0217 13:29:28.240682 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739 WatchSource:0}: Error finding container f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739: Status 404 returned error can't find the container with id f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739 Feb 17 13:29:28 crc kubenswrapper[4804]: E0217 13:29:28.244695 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950bc4c85f2707 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,LastTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.568678 4804 generic.go:334] "Generic 
(PLEG): container finished" podID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerID="c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.568786 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerDied","Data":"c268cbeacb8edca4cf6be1f9ade9d17e4f9a777b74947e1265bd5b8b02378689"} Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.570740 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.571168 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.571603 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.583149 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.584411 4804 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.584980 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585011 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585018 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" exitCode=0 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.585027 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" exitCode=2 Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.587171 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.587787 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: 
I0217 13:29:28.588273 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"} Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f94861c9c3bcc3479971050ad3ef5bf6863ff8dfc0c2711385edc92b0df91739"} Feb 17 13:29:28 crc kubenswrapper[4804]: I0217 13:29:28.589132 4804 scope.go:117] "RemoveContainer" containerID="f1152acc5de4a066908c513b0ef2710ec49f6e6d9f3933ca2421c00984721533" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.438260 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:29Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:77c09c30acdeaaf95ab463052841d32404d264d7b46bead6207afe51848d25e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b7b252dee7cfed79b278bcdec32ab88d70e98e83e6c0db9565a87d9e962cfecb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1701350082},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:14398311b101163ddd1de78c093e161c5d3c9aac51a04e3d3d842fca6317ab0f\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5a091792b99bf4dfaec25f4c8e29da579e2f452d48b924c8323a18accb7f3290\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1234637517},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad77d0ead8abca8b884fad3be18215dbe8b4f8f098053551e4a899298cf5c918\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5338e2ca87e0b47fec93f55559f0ed6b39eef3ed3b7f085a4f0b205ccb86a5d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1213306565},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:28df36269fc553eb1adba5566d6dfc258a1a74063c4cfe8b5bdd3f202591cf56\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7fa59a55753e6c646b3b56a1a7080a5d70767fb964f1857c411fdf4e05ad4c71\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1201887930},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439326 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439599 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439741 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439880 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: E0217 13:29:29.439893 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.596604 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.901697 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.902284 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:29 crc kubenswrapper[4804]: I0217 13:29:29.902565 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023121 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023639 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.023738 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") pod \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\" (UID: \"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.025073 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.026080 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.050806 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" (UID: "1c7ffc91-beb4-48c9-bd6a-3432eb40cb18"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125388 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125425 4804 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.125530 4804 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c7ffc91-beb4-48c9-bd6a-3432eb40cb18-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.271284 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.272432 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.273045 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.273645 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.274172 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429381 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429502 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429528 4804 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429557 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429592 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.429707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430181 4804 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430262 4804 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.430283 4804 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.583555 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.610824 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612088 4804 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" exitCode=0 Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612235 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612277 4804 scope.go:117] "RemoveContainer" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.612940 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.613593 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614280 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614918 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c7ffc91-beb4-48c9-bd6a-3432eb40cb18","Type":"ContainerDied","Data":"c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1"} Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.614978 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1dfeb9653f0eeda2384bf17db4b08547b2f5b249fa71fc9c90c6e17ca6502f1" Feb 17 13:29:30 crc 
kubenswrapper[4804]: I0217 13:29:30.614981 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.619652 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.620116 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.620367 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.623636 4804 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.624019 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.624547 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.637785 4804 scope.go:117] "RemoveContainer" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.660024 4804 scope.go:117] "RemoveContainer" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.680601 4804 scope.go:117] "RemoveContainer" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.703377 4804 scope.go:117] "RemoveContainer" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.724782 4804 scope.go:117] "RemoveContainer" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.753240 4804 scope.go:117] "RemoveContainer" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.754002 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": container with ID starting with a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81 not 
found: ID does not exist" containerID="a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754055 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81"} err="failed to get container status \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": rpc error: code = NotFound desc = could not find container \"a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81\": container with ID starting with a933bd130eda6ec92f47ca1d4a9ffce7c3b6c6e13e3beb802dc19301c3bccc81 not found: ID does not exist" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754098 4804 scope.go:117] "RemoveContainer" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.754711 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": container with ID starting with 5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c not found: ID does not exist" containerID="5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754744 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c"} err="failed to get container status \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": rpc error: code = NotFound desc = could not find container \"5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c\": container with ID starting with 5ed26a0f0242f802b5ffc26b549b1e3ccbab2c65ec8b5eb8f3496e283092377c not found: ID does not exist" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.754772 
4804 scope.go:117] "RemoveContainer" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.755121 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": container with ID starting with 2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc not found: ID does not exist" containerID="2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755167 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc"} err="failed to get container status \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": rpc error: code = NotFound desc = could not find container \"2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc\": container with ID starting with 2113004a96dc24e17644bd15a2f7896576336a147c814581232c69478e70dbdc not found: ID does not exist" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755218 4804 scope.go:117] "RemoveContainer" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.755622 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": container with ID starting with b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8 not found: ID does not exist" containerID="b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755660 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8"} err="failed to get container status \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": rpc error: code = NotFound desc = could not find container \"b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8\": container with ID starting with b880a11ba8a27ef883d7710c6ebc4b81cebcb8306d7cb57679aef5289e793cf8 not found: ID does not exist" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.755679 4804 scope.go:117] "RemoveContainer" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.756010 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": container with ID starting with 93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094 not found: ID does not exist" containerID="93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756038 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094"} err="failed to get container status \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": rpc error: code = NotFound desc = could not find container \"93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094\": container with ID starting with 93824acd532e76cab5bff14eb9b407649e6ffc6321e1493ab91960a9c43b2094 not found: ID does not exist" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756058 4804 scope.go:117] "RemoveContainer" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8" Feb 17 13:29:30 crc kubenswrapper[4804]: E0217 13:29:30.756374 4804 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": container with ID starting with 3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8 not found: ID does not exist" containerID="3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8" Feb 17 13:29:30 crc kubenswrapper[4804]: I0217 13:29:30.756399 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8"} err="failed to get container status \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": rpc error: code = NotFound desc = could not find container \"3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8\": container with ID starting with 3d983b6c0ccc4100875f86d2e3c69b602ecf69dc28883026162edc60e6a231c8 not found: ID does not exist" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.144831 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" containerID="cri-o://50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" gracePeriod=15 Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.564831 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566017 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566237 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.566536 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641012 4804 generic.go:334] "Generic (PLEG): container finished" podID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" exitCode=0 Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641088 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerDied","Data":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"} Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" event={"ID":"81f879fe-7bd1-42d0-b026-80f901641a0b","Type":"ContainerDied","Data":"71eeeb2236ea109e4995422167d6b6185d64b78a4f394944d8af1d30f1eaa147"} Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.641223 4804 scope.go:117] "RemoveContainer" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642024 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642547 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.642856 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.666048 4804 scope.go:117] "RemoveContainer" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" Feb 17 13:29:33 crc kubenswrapper[4804]: E0217 13:29:33.666453 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": container with ID starting with 50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55 not found: ID does not exist" containerID="50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.666484 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55"} err="failed to get container status \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": rpc error: code = NotFound desc = could not find container \"50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55\": container with ID starting with 50587ee39943ecdd522f24486a8c7642f0b337a62b3b45381bc4dd36af224c55 not found: ID does not exist" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.671946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672008 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672029 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672068 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672096 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672120 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672139 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672174 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672225 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: 
\"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672266 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.672295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") pod \"81f879fe-7bd1-42d0-b026-80f901641a0b\" (UID: \"81f879fe-7bd1-42d0-b026-80f901641a0b\") " Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.673636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.675378 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.675579 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.676441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.676586 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679016 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679487 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln" (OuterVolumeSpecName: "kube-api-access-vfbln") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "kube-api-access-vfbln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.679840 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.680025 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.680313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.681667 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.683150 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.685645 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "81f879fe-7bd1-42d0-b026-80f901641a0b" (UID: "81f879fe-7bd1-42d0-b026-80f901641a0b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774845 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774882 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774897 4804 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774911 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774923 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774943 4804 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81f879fe-7bd1-42d0-b026-80f901641a0b-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774956 4804 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774970 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774983 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.774994 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775005 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775019 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775031 4804 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/81f879fe-7bd1-42d0-b026-80f901641a0b-v4-0-config-system-ocp-branding-template\") on 
node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.775043 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfbln\" (UniqueName: \"kubernetes.io/projected/81f879fe-7bd1-42d0-b026-80f901641a0b-kube-api-access-vfbln\") on node \"crc\" DevicePath \"\"" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.962824 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.963352 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:33 crc kubenswrapper[4804]: I0217 13:29:33.963983 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 13:29:36.577072 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 
13:29:36.578070 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:36 crc kubenswrapper[4804]: I0217 13:29:36.578736 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:36 crc kubenswrapper[4804]: E0217 13:29:36.822115 4804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950bc4c85f2707 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,LastTimestamp:2026-02-17 13:29:28.243332871 +0000 UTC m=+242.354752198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 13:29:38 crc 
kubenswrapper[4804]: E0217 13:29:38.105707 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.107172 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.107718 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108025 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108245 4804 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:38 crc kubenswrapper[4804]: I0217 13:29:38.108268 4804 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.108504 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Feb 17 13:29:38 
crc kubenswrapper[4804]: E0217 13:29:38.309894 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Feb 17 13:29:38 crc kubenswrapper[4804]: E0217 13:29:38.710230 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.511681 4804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.573868 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.575344 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.577881 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.578733 4804 status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.597981 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.598293 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.598849 4804 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.599675 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:39 crc kubenswrapper[4804]: I0217 13:29:39.692657 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0b0854f3cb262fa049028621e272340489efe82cfa1fc6f2537c58dd46546101"} Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.770553 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T13:29:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:77c09c30acdeaaf95ab463052841d32404d264d7b46bead6207afe51848d25e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b7b252dee7cfed79b278bcdec32ab88d70e98e83e6c0db9565a87d9e962cfecb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1701350082},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\
"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:14398311b101163ddd1de78c093e161c5d3c9aac51a04e3d3d842fca6317ab0f\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5a091792b99bf4dfaec25f4c8e29da579e2f452d48b924c8323a18accb7f3290\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1234637517},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad77d0ead8abca8b884fad3be18215dbe8b4f8f098053551e4a899298cf5c918\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5338e2ca87e0b47fec93f55559f0ed6b39eef3ed3b7f085a4f0b205ccb86a5d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1213306565},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:28df36269fc553eb1adba5566d6dfc258a1a74063c4cfe8b5bdd3f202591cf56\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7fa59a55753e6c646b3b56a1a7080a5d70767fb964f1857c411fdf4e05ad4c71\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1201887930},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c87
5\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":5
02943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\
\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.771289 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772271 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772548 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772916 4804 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:39 crc kubenswrapper[4804]: E0217 13:29:39.772945 4804 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703160 4804 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="cbe7ae4cd093b968d7c7ee4362aff22a5455f5638fd4398319c0aff8fa79ea7a" exitCode=0 Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703538 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703569 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.703289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"cbe7ae4cd093b968d7c7ee4362aff22a5455f5638fd4398319c0aff8fa79ea7a"} Feb 17 13:29:40 crc kubenswrapper[4804]: E0217 13:29:40.704236 4804 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.705073 4804 
status_manager.go:851] "Failed to get status for pod" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.705641 4804 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:40 crc kubenswrapper[4804]: I0217 13:29:40.706085 4804 status_manager.go:851] "Failed to get status for pod" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" pod="openshift-authentication/oauth-openshift-558db77b4-bstw9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bstw9\": dial tcp 38.102.83.146:6443: connect: connection refused" Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722343 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"102e1e5974425fef155110e41c51470f9e6b807e0b092d863f31de5f50f21dc1"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722719 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6f50029b23d691f8caa47a0fed4b5fc863c7c5fa284179b0e693e264a1499732"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722736 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"332638fe7f4d285f3941d06f5c458e325092eefe1222a3dd152246b25f4b6cf5"} Feb 17 13:29:41 crc kubenswrapper[4804]: I0217 13:29:41.722748 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e9d83018121b03cf9cc210107a95e53990a1917656e8639c6978b83c786f2589"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.639684 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54174->192.168.126.11:10257: read: connection reset by peer" start-of-body= Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.639746 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54174->192.168.126.11:10257: read: connection reset by peer" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.731678 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b0cf48a61f4cdf55a6c1b632af4332f979390f1f9d77984648db8043cd05f9f"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.732276 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.732324 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734773 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734850 4804 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d" exitCode=1 Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.734905 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d"} Feb 17 13:29:42 crc kubenswrapper[4804]: I0217 13:29:42.735509 4804 scope.go:117] "RemoveContainer" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d" Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.746190 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.746756 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"} Feb 17 13:29:43 crc kubenswrapper[4804]: I0217 13:29:43.785474 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.600237 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.600332 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:44 crc kubenswrapper[4804]: I0217 13:29:44.610090 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.741842 4804 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769132 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769183 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.769236 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.775477 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:29:47 crc kubenswrapper[4804]: I0217 13:29:47.810908 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.773037 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.773413 4804 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.777001 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846641 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846956 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 13:29:48 crc kubenswrapper[4804]: I0217 13:29:48.846992 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.777840 4804 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.777882 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5cc0810-c040-4cf1-a739-fcd9be2be222" Feb 17 13:29:49 crc kubenswrapper[4804]: I0217 13:29:49.786869 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ab993b1e-b5d6-4960-87de-575a7efa0fa6" Feb 17 13:29:57 crc kubenswrapper[4804]: I0217 13:29:57.378422 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.490355 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.791494 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.847585 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 13:29:58 crc kubenswrapper[4804]: I0217 13:29:58.847980 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 13:29:59 crc kubenswrapper[4804]: I0217 13:29:59.849568 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 13:29:59 crc kubenswrapper[4804]: I0217 13:29:59.945284 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.108008 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.141701 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.174042 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.197077 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.342025 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.434701 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.512278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.522924 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.642743 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.702516 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 13:30:00 crc kubenswrapper[4804]: I0217 13:30:00.907736 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.141110 4804 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.179796 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.551801 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.597412 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.697888 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.754983 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.772441 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.772807 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.776892 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.894846 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.959927 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.961614 
4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 13:30:01 crc kubenswrapper[4804]: I0217 13:30:01.977177 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.010723 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.044109 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.096400 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.110864 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.357316 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.381656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.399946 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.447085 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.484219 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 13:30:02 crc 
kubenswrapper[4804]: I0217 13:30:02.487466 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 13:30:02 crc kubenswrapper[4804]: I0217 13:30:02.684116 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.005190 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.009326 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.090376 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.130909 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.379932 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.442498 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.462113 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.483579 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.492300 4804 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.570394 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.679714 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.839126 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.867156 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.875693 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.917696 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.932649 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.950487 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.956655 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 13:30:03 crc kubenswrapper[4804]: I0217 13:30:03.969078 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"config" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.025188 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.058427 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.071421 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.154907 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.418600 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.423451 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.428612 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.530844 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.634019 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.664173 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.678658 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.686270 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.697304 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.833952 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 13:30:04 crc kubenswrapper[4804]: I0217 13:30:04.927910 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.065023 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.072943 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.094598 4804 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.099057 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.168171 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.169760 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.192811 4804 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.226716 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.258693 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.337202 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.548707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.736271 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.891848 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 13:30:05 crc kubenswrapper[4804]: I0217 13:30:05.898107 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.065551 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.233271 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.257361 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.291986 4804 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.314229 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.337618 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.351276 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.359033 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.451064 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.452974 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.457467 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.474151 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.519707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.620523 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.650947 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.780272 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.824890 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 13:30:06 crc kubenswrapper[4804]: I0217 13:30:06.855760 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.040412 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.044490 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.059620 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.102838 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.139829 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.211668 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.370111 4804 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.432642 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.454959 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.493943 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.588768 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.602888 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.606284 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.636847 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.653302 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.821182 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.828266 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.944790 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 13:30:07 crc kubenswrapper[4804]: I0217 13:30:07.994521 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.028491 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.029852 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.041075 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.057883 4804 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.058307 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.058285014 podStartE2EDuration="41.058285014s" podCreationTimestamp="2026-02-17 13:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:29:47.784780741 +0000 UTC m=+261.896200098" watchObservedRunningTime="2026-02-17 13:30:08.058285014 +0000 UTC m=+282.169704391" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.066948 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bstw9","openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.067050 4804 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.075439 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.095566 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.095550119 podStartE2EDuration="21.095550119s" podCreationTimestamp="2026-02-17 13:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:30:08.09045004 +0000 UTC m=+282.201869387" watchObservedRunningTime="2026-02-17 13:30:08.095550119 +0000 UTC m=+282.206969466" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.185111 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.317681 4804 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.352513 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.369087 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.432696 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.479764 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 13:30:08 crc 
kubenswrapper[4804]: I0217 13:30:08.522617 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.526443 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.529473 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551329 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"] Feb 17 13:30:08 crc kubenswrapper[4804]: E0217 13:30:08.551572 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551585 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" Feb 17 13:30:08 crc kubenswrapper[4804]: E0217 13:30:08.551609 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551617 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7ffc91-beb4-48c9-bd6a-3432eb40cb18" containerName="installer" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.551743 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" containerName="oauth-openshift" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.552296 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.554562 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.554808 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.555128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.555456 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557257 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557436 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557719 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.557924 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.558796 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.560350 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.560482 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.561598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.567062 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.567721 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.572128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.585395 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f879fe-7bd1-42d0-b026-80f901641a0b" path="/var/lib/kubelet/pods/81f879fe-7bd1-42d0-b026-80f901641a0b/volumes" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.610652 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.614247 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.649344 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.691485 4804 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.708617 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.713992 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714229 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: 
\"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714307 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714367 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714401 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714470 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714636 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714792 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714822 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714882 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.714909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.775188 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.782470 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.785701 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816010 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816077 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod 
\"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816177 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816293 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 
13:30:08.816327 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816375 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") 
pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816553 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.816638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.817390 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-dir\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.817873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-audit-policies\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.818155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.818545 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.820552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.825807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " 
pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.825840 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-error\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.826041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-template-login\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.826631 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-session\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.827239 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.828793 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.830781 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.837118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8dbf5bc9-5a1a-4946-823d-1da911581f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.839361 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847158 4804 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847264 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" 
output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.847323 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.848073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxn5\" (UniqueName: \"kubernetes.io/projected/8dbf5bc9-5a1a-4946-823d-1da911581f59-kube-api-access-mrxn5\") pod \"oauth-openshift-6ffb55868c-gf9sh\" (UID: \"8dbf5bc9-5a1a-4946-823d-1da911581f59\") " pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.849370 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.850290 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c" gracePeriod=30 Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.873666 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 13:30:08 crc kubenswrapper[4804]: I0217 13:30:08.874839 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 
13:30:08.890290 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.890847 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.956689 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.973471 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:08.995906 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.008504 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.055570 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.120769 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.121765 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.125871 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.219098 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.226138 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.336452 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.354952 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.402013 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.431039 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.571323 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.648278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.771678 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.825786 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.874978 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.878529 4804 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.905158 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.917299 4804 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:09 crc kubenswrapper[4804]: I0217 13:30:09.950834 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.027763 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.121835 4804 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.122051 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" gracePeriod=5 Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.273168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.282786 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.367282 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.531645 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.558307 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.593196 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.604629 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.722192 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.737260 4804 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.765553 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.867960 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.872938 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 13:30:10 crc kubenswrapper[4804]: I0217 13:30:10.877415 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.105947 4804 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.184942 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.393216 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.600048 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.679319 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.829877 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.860382 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.872125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"] Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.914682 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 13:30:11 crc kubenswrapper[4804]: I0217 13:30:11.936126 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.014318 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 13:30:12 
crc kubenswrapper[4804]: I0217 13:30:12.066579 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.223057 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.322557 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6ffb55868c-gf9sh"] Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.351128 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.396905 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.594158 4804 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.601258 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.636255 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.707120 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.792147 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.850553 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" event={"ID":"8dbf5bc9-5a1a-4946-823d-1da911581f59","Type":"ContainerStarted","Data":"6ed01f5efaf7d487a86c0e691215c53eaa541deaf5982f335c0080bcbfa88b4f"} Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" event={"ID":"8dbf5bc9-5a1a-4946-823d-1da911581f59","Type":"ContainerStarted","Data":"f4723ca020f66f609ab934eb8c1e53d0c21b380d3f685f97a1d7387e1fbed0ba"} Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.914429 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.915329 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.919873 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" Feb 17 13:30:12 crc kubenswrapper[4804]: I0217 13:30:12.933979 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6ffb55868c-gf9sh" podStartSLOduration=64.933964372 podStartE2EDuration="1m4.933964372s" podCreationTimestamp="2026-02-17 13:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:30:12.931755478 +0000 UTC m=+287.043174835" watchObservedRunningTime="2026-02-17 13:30:12.933964372 +0000 UTC m=+287.045383709" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 
13:30:13.025888 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.063110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.190707 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.248002 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.489332 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.507252 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.534812 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.551081 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.567510 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.654608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.689734 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 13:30:13 crc 
kubenswrapper[4804]: I0217 13:30:13.844513 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.918843 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 13:30:13 crc kubenswrapper[4804]: I0217 13:30:13.977650 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.000366 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.160436 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.224902 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.231126 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.247662 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.380534 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.500681 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.543941 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" 
Feb 17 13:30:14 crc kubenswrapper[4804]: I0217 13:30:14.943980 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.266032 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.336960 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.360346 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.448706 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.802196 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.802916 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904796 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904852 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.904971 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905103 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905337 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905394 4804 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.905071 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.913491 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.931961 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932043 4804 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" exitCode=137 Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932089 4804 scope.go:117] "RemoveContainer" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.932220 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.959168 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.971035 4804 scope.go:117] "RemoveContainer" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" Feb 17 13:30:15 crc kubenswrapper[4804]: E0217 13:30:15.971609 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": container with ID starting with 2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf not found: ID does not exist" containerID="2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf" Feb 17 13:30:15 crc kubenswrapper[4804]: I0217 13:30:15.971755 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf"} err="failed to get container status \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": rpc error: code = NotFound desc = could not find container \"2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf\": container with ID starting with 2b3550f4627e08b0ac9b2c3b416d986c65ee31913e0c64bd49e44e832ba5e1bf not found: ID does not exist" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.005998 4804 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006032 4804 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on 
node \"crc\" DevicePath \"\"" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006041 4804 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.006050 4804 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.218851 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.591926 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.592187 4804 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.609424 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.609485 4804 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b339e42-270d-4384-9d64-67edf62c1ad5" Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.610692 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 13:30:16 crc kubenswrapper[4804]: I0217 13:30:16.610741 4804 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" 
mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b339e42-270d-4384-9d64-67edf62c1ad5" Feb 17 13:30:26 crc kubenswrapper[4804]: I0217 13:30:26.373916 4804 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.167262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.170947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171007 4804 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c" exitCode=137 Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171053 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b52548660c4b8e92ad39e3e40e26c3218848efb2b6171a343cd6cc6914a3928c"} Feb 17 13:30:39 crc kubenswrapper[4804]: I0217 13:30:39.171093 4804 scope.go:117] "RemoveContainer" containerID="c6214e2e43d615d8ab0f059effb9c46c5b0df3614e054cc760ef938708759c1d" Feb 17 13:30:40 crc kubenswrapper[4804]: I0217 13:30:40.178666 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 17 13:30:40 crc kubenswrapper[4804]: I0217 13:30:40.180286 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51d848e7fcac5ba8a752f5e5974f1297fda11e24720dd6b2c062443ccf88803d"} Feb 17 13:30:43 crc kubenswrapper[4804]: I0217 13:30:43.786526 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:30:48 crc kubenswrapper[4804]: I0217 13:30:48.846707 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:30:48 crc kubenswrapper[4804]: I0217 13:30:48.850309 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:30:49 crc kubenswrapper[4804]: I0217 13:30:49.233670 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.697652 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 13:31:00 crc kubenswrapper[4804]: E0217 13:31:00.698410 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698426 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698544 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.698865 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.702228 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.703621 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708759 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708791 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.708852 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.729584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.764147 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.764356 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" containerID="cri-o://1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" gracePeriod=30 Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.787968 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.788180 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" containerID="cri-o://cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" gracePeriod=30 Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809761 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" 
(UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.809810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.810689 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.818347 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:00 crc kubenswrapper[4804]: I0217 13:31:00.849666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"collect-profiles-29522250-ddtb9\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.014795 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.152979 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219715 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219794 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.219856 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") pod \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\" (UID: \"b710ce8a-f177-4c60-b8d5-bbf18bf38737\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.220975 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.221087 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config" (OuterVolumeSpecName: "config") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.227358 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.233129 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh" (OuterVolumeSpecName: "kube-api-access-7bxfh") pod "b710ce8a-f177-4c60-b8d5-bbf18bf38737" (UID: "b710ce8a-f177-4c60-b8d5-bbf18bf38737"). InnerVolumeSpecName "kube-api-access-7bxfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.268729 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302161 4804 generic.go:334] "Generic (PLEG): container finished" podID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" exitCode=0 Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerDied","Data":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"} Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302251 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302311 4804 scope.go:117] "RemoveContainer" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.302295 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f58dbd5-dlsdn" event={"ID":"9631847b-1aa3-4bbd-95d4-cee45d896b11","Type":"ContainerDied","Data":"2e84da0c7befea7833b925b3ff40e336177c9ccd82633eca63155bf470709de5"} Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305159 4804 generic.go:334] "Generic (PLEG): container finished" podID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" exitCode=0 Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" 
event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerDied","Data":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"} Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305331 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" event={"ID":"b710ce8a-f177-4c60-b8d5-bbf18bf38737","Type":"ContainerDied","Data":"558d5dd2eecf846742fd5b4dd243c32953c0fb248ec2faa9cde568927170e4d7"} Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.305585 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.322911 4804 scope.go:117] "RemoveContainer" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" Feb 17 13:31:01 crc kubenswrapper[4804]: E0217 13:31:01.323335 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": container with ID starting with cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6 not found: ID does not exist" containerID="cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.323362 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6"} err="failed to get container status \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": rpc error: code = NotFound desc = could not find container \"cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6\": container with ID starting with cde3dcc2944a4f66700d47606f5e6e510987af0087414aebc458b9e7ef37b2f6 not found: ID does not exist" Feb 17 13:31:01 crc 
kubenswrapper[4804]: I0217 13:31:01.323383 4804 scope.go:117] "RemoveContainer" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324182 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b710ce8a-f177-4c60-b8d5-bbf18bf38737-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324239 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bxfh\" (UniqueName: \"kubernetes.io/projected/b710ce8a-f177-4c60-b8d5-bbf18bf38737-kube-api-access-7bxfh\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324256 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.324265 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b710ce8a-f177-4c60-b8d5-bbf18bf38737-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.340678 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345070 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f687946cc-tvs6k"] Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345128 4804 scope.go:117] "RemoveContainer" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" Feb 17 13:31:01 crc kubenswrapper[4804]: E0217 13:31:01.345576 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": container with ID starting with 1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1 not found: ID does not exist" containerID="1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.345611 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1"} err="failed to get container status \"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": rpc error: code = NotFound desc = could not find container \"1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1\": container with ID starting with 1e274226abe5afcd191e577cb5faf8a1c529d1fa501d5a64c7f114b18af605c1 not found: ID does not exist" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426273 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426360 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426430 
4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.426472 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") pod \"9631847b-1aa3-4bbd-95d4-cee45d896b11\" (UID: \"9631847b-1aa3-4bbd-95d4-cee45d896b11\") " Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427067 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427078 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca" (OuterVolumeSpecName: "client-ca") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.427176 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config" (OuterVolumeSpecName: "config") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.429325 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.431804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq" (OuterVolumeSpecName: "kube-api-access-7cpwq") pod "9631847b-1aa3-4bbd-95d4-cee45d896b11" (UID: "9631847b-1aa3-4bbd-95d4-cee45d896b11"). InnerVolumeSpecName "kube-api-access-7cpwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527939 4804 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9631847b-1aa3-4bbd-95d4-cee45d896b11-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527976 4804 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527988 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.527996 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cpwq\" (UniqueName: \"kubernetes.io/projected/9631847b-1aa3-4bbd-95d4-cee45d896b11-kube-api-access-7cpwq\") on node \"crc\" 
DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.528006 4804 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9631847b-1aa3-4bbd-95d4-cee45d896b11-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.596836 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 13:31:01 crc kubenswrapper[4804]: W0217 13:31:01.599303 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f0ac4b_5b59_4ff9_92ba_54668fffef27.slice/crio-c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566 WatchSource:0}: Error finding container c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566: Status 404 returned error can't find the container with id c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566 Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.629337 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:31:01 crc kubenswrapper[4804]: I0217 13:31:01.635386 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66f58dbd5-dlsdn"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.315050 4804 generic.go:334] "Generic (PLEG): container finished" podID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerID="c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306" exitCode=0 Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.315161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerDied","Data":"c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306"} Feb 17 13:31:02 
crc kubenswrapper[4804]: I0217 13:31:02.315400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerStarted","Data":"c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566"} Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.550758 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:02 crc kubenswrapper[4804]: E0217 13:31:02.551189 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: E0217 13:31:02.551303 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551325 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551583 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" containerName="controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.551630 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" containerName="route-controller-manager" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.552463 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554455 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554656 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.554746 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555087 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555534 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.555834 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.556990 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.558024 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.561133 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.562479 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.562643 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.563463 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568053 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568180 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568322 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568497 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.568746 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.599054 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9631847b-1aa3-4bbd-95d4-cee45d896b11" 
path="/var/lib/kubelet/pods/9631847b-1aa3-4bbd-95d4-cee45d896b11/volumes" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.599672 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b710ce8a-f177-4c60-b8d5-bbf18bf38737" path="/var/lib/kubelet/pods/b710ce8a-f177-4c60-b8d5-bbf18bf38737/volumes" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642877 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642942 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.642968 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643007 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod 
\"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643030 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643113 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643314 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.643359 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.743917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.743997 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744038 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744065 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " 
pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744103 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744142 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744168 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744238 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744271 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.744965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-client-ca\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745061 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-client-ca\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745179 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-config\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.745268 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-proxy-ca-bundles\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.746139 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26901d5-e751-441b-9453-27e1f001a3a9-config\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.752009 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e26901d5-e751-441b-9453-27e1f001a3a9-serving-cert\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.754722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-serving-cert\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.761720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qc9n\" (UniqueName: \"kubernetes.io/projected/b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180-kube-api-access-4qc9n\") pod \"route-controller-manager-5b87cd88c-9bg4n\" (UID: \"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180\") " pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.764564 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlns\" (UniqueName: \"kubernetes.io/projected/e26901d5-e751-441b-9453-27e1f001a3a9-kube-api-access-wmlns\") pod \"controller-manager-6df4db785d-ddhq7\" (UID: \"e26901d5-e751-441b-9453-27e1f001a3a9\") " 
pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.914740 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:02 crc kubenswrapper[4804]: I0217 13:31:02.926383 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.115138 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df4db785d-ddhq7"] Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.218708 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n"] Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.323611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" event={"ID":"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180","Type":"ContainerStarted","Data":"5e6b651d76cfcfa9eea105d262e1df3063a9ffad4a7ecea8637f0a180b1ba235"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.326432 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" event={"ID":"e26901d5-e751-441b-9453-27e1f001a3a9","Type":"ContainerStarted","Data":"ad729f732fd481db4774293eb194a2fac94384e1fb71d6667ddff8d3803accf2"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.326485 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" event={"ID":"e26901d5-e751-441b-9453-27e1f001a3a9","Type":"ContainerStarted","Data":"cfb009c7a5ab1bf7984ceca99d0eef609ed38297bd6e80482b73b23defa4ef9a"} Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.354475 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" podStartSLOduration=3.35445174 podStartE2EDuration="3.35445174s" podCreationTimestamp="2026-02-17 13:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:03.340579632 +0000 UTC m=+337.451998989" watchObservedRunningTime="2026-02-17 13:31:03.35445174 +0000 UTC m=+337.465871067" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.518490 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559160 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559277 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.559318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") pod \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\" (UID: \"f9f0ac4b-5b59-4ff9-92ba-54668fffef27\") " Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.560160 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume" (OuterVolumeSpecName: "config-volume") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.568808 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.572378 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798" (OuterVolumeSpecName: "kube-api-access-tx798") pod "f9f0ac4b-5b59-4ff9-92ba-54668fffef27" (UID: "f9f0ac4b-5b59-4ff9-92ba-54668fffef27"). InnerVolumeSpecName "kube-api-access-tx798". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660743 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx798\" (UniqueName: \"kubernetes.io/projected/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-kube-api-access-tx798\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660780 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:03 crc kubenswrapper[4804]: I0217 13:31:03.660789 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9f0ac4b-5b59-4ff9-92ba-54668fffef27-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333110 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333106 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9" event={"ID":"f9f0ac4b-5b59-4ff9-92ba-54668fffef27","Type":"ContainerDied","Data":"c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566"} Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.333474 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1e49c136c4dbba235deb6639e43d6f45ce9dccf72695e9e4f2c1d14c70c1566" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" event={"ID":"b2f9fd9e-7c48-4ec8-b1d6-89d4ab4c6180","Type":"ContainerStarted","Data":"6a48ec29f3734b19aebf029005af544ceafafc0ba49ac1f3e29a6dbbe82c4dbd"} 
Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335581 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.335734 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.343773 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6df4db785d-ddhq7" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.346754 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" Feb 17 13:31:04 crc kubenswrapper[4804]: I0217 13:31:04.354513 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b87cd88c-9bg4n" podStartSLOduration=4.354498187 podStartE2EDuration="4.354498187s" podCreationTimestamp="2026-02-17 13:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:04.35228031 +0000 UTC m=+338.463699647" watchObservedRunningTime="2026-02-17 13:31:04.354498187 +0000 UTC m=+338.465917524" Feb 17 13:31:25 crc kubenswrapper[4804]: I0217 13:31:25.835396 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:31:25 crc kubenswrapper[4804]: I0217 13:31:25.836089 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578034 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"] Feb 17 13:31:45 crc kubenswrapper[4804]: E0217 13:31:45.578792 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578806 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.578901 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" containerName="collect-profiles" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.579293 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.599801 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.608895 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.608970 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609013 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609056 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609108 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609185 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609242 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.609281 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.620107 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 
13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.623615 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpw7w" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server" containerID="cri-o://0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" gracePeriod=30 Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.624576 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.624764 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54w49" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server" containerID="cri-o://b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" gracePeriod=30 Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.642933 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.643240 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" containerID="cri-o://249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" gracePeriod=30 Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.648067 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.648312 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fvtl6" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server" 
containerID="cri-o://9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" gracePeriod=30 Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.654310 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.654795 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xf58f" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server" containerID="cri-o://6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" gracePeriod=30 Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.668317 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.677408 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.677990 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.689432 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"] Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710291 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710311 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 
13:31:45.710360 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.710398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.711762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-certificates\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.713767 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.714651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-trusted-ca\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.718348 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.718782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-registry-tls\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.728680 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-bound-sa-token\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: \"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.728765 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdcb\" (UniqueName: \"kubernetes.io/projected/556a721b-bf87-43d3-9d93-fabcb7f8f1b0-kube-api-access-9zdcb\") pod \"image-registry-66df7c8f76-tzg5s\" (UID: 
\"556a721b-bf87-43d3-9d93-fabcb7f8f1b0\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.812234 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.812641 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.816627 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.898343 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917408 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917496 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.917516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.919545 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.920967 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.936262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqn79\" (UniqueName: \"kubernetes.io/projected/78a56ea9-6641-4d2d-8471-b40e5f2cf7e5-kube-api-access-qqn79\") pod \"marketplace-operator-79b997595-26cwx\" (UID: \"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:45 crc kubenswrapper[4804]: I0217 13:31:45.992994 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.110734 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.122005 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.129855 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.139070 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222707 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.222887 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") pod \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\" (UID: \"6a10f4e7-7906-43aa-98fb-e709a71a55d2\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.223997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities" (OuterVolumeSpecName: "utilities") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.233361 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj" (OuterVolumeSpecName: "kube-api-access-zdxzj") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "kube-api-access-zdxzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.249321 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a10f4e7-7906-43aa-98fb-e709a71a55d2" (UID: "6a10f4e7-7906-43aa-98fb-e709a71a55d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324057 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324431 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 
13:31:46.324508 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cwx\" (UniqueName: \"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") pod \"cbda9f29-b199-4a42-8757-f5ecc90f0437\" (UID: \"cbda9f29-b199-4a42-8757-f5ecc90f0437\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324600 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") pod \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\" (UID: \"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324649 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") pod \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\" (UID: \"2ce6eded-da13-4bb7-a87d-71b87d0e7f06\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324953 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-catalog-content\") on 
node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324971 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdxzj\" (UniqueName: \"kubernetes.io/projected/6a10f4e7-7906-43aa-98fb-e709a71a55d2-kube-api-access-zdxzj\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.324985 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a10f4e7-7906-43aa-98fb-e709a71a55d2-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.326933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.328134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities" (OuterVolumeSpecName: "utilities") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.328254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities" (OuterVolumeSpecName: "utilities") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.331931 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.332060 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx" (OuterVolumeSpecName: "kube-api-access-g6cwx") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "kube-api-access-g6cwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.332243 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw" (OuterVolumeSpecName: "kube-api-access-x4gxw") pod "2ce6eded-da13-4bb7-a87d-71b87d0e7f06" (UID: "2ce6eded-da13-4bb7-a87d-71b87d0e7f06"). InnerVolumeSpecName "kube-api-access-x4gxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.333018 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj" (OuterVolumeSpecName: "kube-api-access-pm9gj") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). InnerVolumeSpecName "kube-api-access-pm9gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.363726 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzg5s"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.405772 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbda9f29-b199-4a42-8757-f5ecc90f0437" (UID: "cbda9f29-b199-4a42-8757-f5ecc90f0437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426284 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426326 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426350 4804 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426360 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gxw\" (UniqueName: \"kubernetes.io/projected/2ce6eded-da13-4bb7-a87d-71b87d0e7f06-kube-api-access-x4gxw\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426370 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cwx\" (UniqueName: 
\"kubernetes.io/projected/cbda9f29-b199-4a42-8757-f5ecc90f0437-kube-api-access-g6cwx\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426380 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426388 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbda9f29-b199-4a42-8757-f5ecc90f0437-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.426396 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9gj\" (UniqueName: \"kubernetes.io/projected/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-kube-api-access-pm9gj\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.458914 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26cwx"] Feb 17 13:31:46 crc kubenswrapper[4804]: W0217 13:31:46.472054 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a56ea9_6641_4d2d_8471_b40e5f2cf7e5.slice/crio-588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d WatchSource:0}: Error finding container 588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d: Status 404 returned error can't find the container with id 588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.476487 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" (UID: "4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.488393 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.527081 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.580215 4804 generic.go:334] "Generic (PLEG): container finished" podID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.580312 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54w49" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.584925 4804 generic.go:334] "Generic (PLEG): container finished" podID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.585013 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.588403 4804 generic.go:334] "Generic (PLEG): container finished" podID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.588572 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fvtl6" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.591383 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596152 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596286 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54w49" event={"ID":"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df","Type":"ContainerDied","Data":"14bd0e0c6146aca8722f654770d91415f769ddfe462bd310b48fc23e91722dce"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" event={"ID":"556a721b-bf87-43d3-9d93-fabcb7f8f1b0","Type":"ContainerStarted","Data":"1e3905e44f97c52e95e55a0e174cf9f2ec9e9413b7bb547239857c4e21e540c5"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596458 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" event={"ID":"556a721b-bf87-43d3-9d93-fabcb7f8f1b0","Type":"ContainerStarted","Data":"dd70416b24d05181ecf15f4993d0ecac08d8523b5d8892f161f79eac2cb31ba9"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596539 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596629 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" 
event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerDied","Data":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596713 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6k2g8" event={"ID":"2ce6eded-da13-4bb7-a87d-71b87d0e7f06","Type":"ContainerDied","Data":"8d3bbbb9c8ddaebadf3050ba63a4409fb724b92775f2af121beab0c80c2020a4"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596796 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fvtl6" event={"ID":"6a10f4e7-7906-43aa-98fb-e709a71a55d2","Type":"ContainerDied","Data":"122644669fc551cce79300f93153f1ee66ee7078e3af8dcd19bd62ec42ba0f74"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.596978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" event={"ID":"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5","Type":"ContainerStarted","Data":"f063158903793c873968f2a56861ba5637643358caadd6057e031c9e3fa7390d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" event={"ID":"78a56ea9-6641-4d2d-8471-b40e5f2cf7e5","Type":"ContainerStarted","Data":"588e9c2b27f2947186cfe36096e284468ba847f94f45613c4ddfd2ff4b3a556d"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597163 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" 
event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.594022 4804 generic.go:334] "Generic (PLEG): container finished" podID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.592500 4804 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-26cwx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597467 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf58f" event={"ID":"4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5","Type":"ContainerDied","Data":"6c2639b1b465093d91b07ae1fd7b695d64615f297ec3d0a8c5e28adb5bb00161"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597219 4804 scope.go:117] "RemoveContainer" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.594122 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf58f" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.597702 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" podUID="78a56ea9-6641-4d2d-8471-b40e5f2cf7e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600169 4804 generic.go:334] "Generic (PLEG): container finished" podID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" exitCode=0 Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600228 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpw7w" event={"ID":"cbda9f29-b199-4a42-8757-f5ecc90f0437","Type":"ContainerDied","Data":"f8fddc3c1f1b98532bbecd6c7da5c2a2368e8ed8a3bd8f6f7983638879bf50a9"} Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.600327 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpw7w" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.616817 4804 scope.go:117] "RemoveContainer" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628176 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628440 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.628623 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") pod \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\" (UID: \"5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df\") " Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.630735 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities" (OuterVolumeSpecName: "utilities") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.649365 4804 scope.go:117] "RemoveContainer" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.652449 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm" (OuterVolumeSpecName: "kube-api-access-rmjlm") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "kube-api-access-rmjlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.672773 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s" podStartSLOduration=1.672751679 podStartE2EDuration="1.672751679s" podCreationTimestamp="2026-02-17 13:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:46.672451351 +0000 UTC m=+380.783870688" watchObservedRunningTime="2026-02-17 13:31:46.672751679 +0000 UTC m=+380.784171016" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.700404 4804 scope.go:117] "RemoveContainer" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.703754 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": container with ID starting with b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd not found: ID does not exist" containerID="b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.703817 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd"} err="failed to get container status \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": rpc error: code = NotFound desc = could not find container \"b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd\": container with ID starting with b1df388928454341d2cfda489583d80c64925db6adb630d1962fca880505bbbd not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.703852 4804 scope.go:117] "RemoveContainer" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.706587 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": container with ID starting with b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef not found: ID does not exist" containerID="b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.706623 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef"} err="failed to get container status \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": rpc error: code = NotFound desc = could not find container \"b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef\": container with ID starting with b18e486af5e04ed68c256a5f17c4e5bc48b129f17f0b27df8fa47ac15336cfef not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.706830 4804 scope.go:117] "RemoveContainer" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 
13:31:46.708577 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": container with ID starting with 631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c not found: ID does not exist" containerID="631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.708748 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c"} err="failed to get container status \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": rpc error: code = NotFound desc = could not find container \"631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c\": container with ID starting with 631e1502ce58a957ecc7f80d4be00f1ff4ee5647304073d92dd226e1e956e02c not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.708903 4804 scope.go:117] "RemoveContainer" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.713325 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" podStartSLOduration=1.7133047860000001 podStartE2EDuration="1.713304786s" podCreationTimestamp="2026-02-17 13:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:31:46.705383442 +0000 UTC m=+380.816802809" watchObservedRunningTime="2026-02-17 13:31:46.713304786 +0000 UTC m=+380.824724133" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.721049 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:31:46 crc kubenswrapper[4804]: 
I0217 13:31:46.731231 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.731264 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmjlm\" (UniqueName: \"kubernetes.io/projected/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-kube-api-access-rmjlm\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.735325 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fvtl6"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740123 4804 scope.go:117] "RemoveContainer" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.740806 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": container with ID starting with 249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d not found: ID does not exist" containerID="249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740833 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d"} err="failed to get container status \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": rpc error: code = NotFound desc = could not find container \"249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d\": container with ID starting with 249b979edb84b2aac9f6e8c09057b08c5a469e66d87734e7e060d0b2622d365d not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.740858 4804 scope.go:117] 
"RemoveContainer" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.746364 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.754193 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" (UID: "5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.757809 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6k2g8"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.761117 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.766647 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpw7w"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.771854 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.772935 4804 scope.go:117] "RemoveContainer" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.779112 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xf58f"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.791109 4804 scope.go:117] "RemoveContainer" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: 
I0217 13:31:46.810491 4804 scope.go:117] "RemoveContainer" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.810859 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": container with ID starting with 9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b not found: ID does not exist" containerID="9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.810912 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b"} err="failed to get container status \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": rpc error: code = NotFound desc = could not find container \"9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b\": container with ID starting with 9895e70116d773812fe7c81811ca37d9cf3877a5e4c2894683701b841e72852b not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.810943 4804 scope.go:117] "RemoveContainer" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.811558 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": container with ID starting with 75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201 not found: ID does not exist" containerID="75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.811584 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201"} err="failed to get container status \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": rpc error: code = NotFound desc = could not find container \"75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201\": container with ID starting with 75097ed47129979b31be1fb063b6f429a2375e95db196451ba1b30153d8fc201 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.811600 4804 scope.go:117] "RemoveContainer" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.812360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": container with ID starting with eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5 not found: ID does not exist" containerID="eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.812386 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5"} err="failed to get container status \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": rpc error: code = NotFound desc = could not find container \"eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5\": container with ID starting with eafbbbbbea590f0ceb80b9fa9d1c65902281f33937ef810444499dad62ff97c5 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.812399 4804 scope.go:117] "RemoveContainer" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.829849 4804 scope.go:117] "RemoveContainer" 
containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.832841 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.848400 4804 scope.go:117] "RemoveContainer" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864473 4804 scope.go:117] "RemoveContainer" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.864877 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": container with ID starting with 6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240 not found: ID does not exist" containerID="6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864942 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240"} err="failed to get container status \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": rpc error: code = NotFound desc = could not find container \"6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240\": container with ID starting with 6f5ffb12c26b0fccbe16222c85dcb4bbe8bbf2181d09e1a07e43c582de974240 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.864987 4804 scope.go:117] "RemoveContainer" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 
13:31:46.865428 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": container with ID starting with de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e not found: ID does not exist" containerID="de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865458 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e"} err="failed to get container status \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": rpc error: code = NotFound desc = could not find container \"de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e\": container with ID starting with de147f3e2e549f3082d93cbb71b29ca9b9c52bb022b8b09a8bc2c4b12089d55e not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865482 4804 scope.go:117] "RemoveContainer" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.865796 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": container with ID starting with f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76 not found: ID does not exist" containerID="f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865814 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76"} err="failed to get container status \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": rpc 
error: code = NotFound desc = could not find container \"f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76\": container with ID starting with f15bcf4d827a1a1b8fffb482f5c62a175b13f562fa7361a7f56418b0dbd2dd76 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.865827 4804 scope.go:117] "RemoveContainer" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.878218 4804 scope.go:117] "RemoveContainer" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.892979 4804 scope.go:117] "RemoveContainer" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.910947 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.912729 4804 scope.go:117] "RemoveContainer" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.913282 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": container with ID starting with 0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77 not found: ID does not exist" containerID="0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913333 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77"} err="failed to get container status \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": rpc error: code = NotFound desc = could not find 
container \"0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77\": container with ID starting with 0d6529764fb52a660f73a1c0bf0bbd57d05ce1c0989e42faa62fceb8a8e28c77 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913367 4804 scope.go:117] "RemoveContainer" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.913657 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": container with ID starting with 3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19 not found: ID does not exist" containerID="3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913691 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19"} err="failed to get container status \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": rpc error: code = NotFound desc = could not find container \"3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19\": container with ID starting with 3c6feed8e302b106af248336d8cbbdff9d744222eb1b31aea9ea024adc236f19 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.913715 4804 scope.go:117] "RemoveContainer" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44" Feb 17 13:31:46 crc kubenswrapper[4804]: E0217 13:31:46.914063 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": container with ID starting with 88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44 not found: ID does 
not exist" containerID="88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.914090 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44"} err="failed to get container status \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": rpc error: code = NotFound desc = could not find container \"88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44\": container with ID starting with 88fbb69755113b5be170561fa4a81a646f0de4078219f8a5bd52d347e075ff44 not found: ID does not exist" Feb 17 13:31:46 crc kubenswrapper[4804]: I0217 13:31:46.914420 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54w49"] Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.612501 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-26cwx" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.649779 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"] Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650003 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650018 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650031 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650038 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" 
containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650047 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650054 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650063 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650070 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650081 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650088 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650099 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650107 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650122 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650130 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" 
containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650148 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650159 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650167 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="extract-content" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="extract-utilities" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650214 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650224 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650235 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650242 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: E0217 13:31:47.650254 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650402 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650417 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650429 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650443 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" containerName="registry-server" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.650453 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" containerName="marketplace-operator" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.651308 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.654912 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.667270 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"] Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749306 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.749363 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851149 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: 
\"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851229 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851263 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851747 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-catalog-content\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.851833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7d80260-64fd-4975-a620-5c515a765fd3-utilities\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.870466 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkg9\" (UniqueName: \"kubernetes.io/projected/e7d80260-64fd-4975-a620-5c515a765fd3-kube-api-access-wrkg9\") pod \"redhat-marketplace-5fs82\" (UID: \"e7d80260-64fd-4975-a620-5c515a765fd3\") " 
pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:47 crc kubenswrapper[4804]: I0217 13:31:47.984488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fs82" Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.229807 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"] Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.231064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.233359 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.243436 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"] Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356430 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356469 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.356522 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.452062 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fs82"]
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.457848 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.457884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.458671 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.476428 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"redhat-operators-bhcxz\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.556316 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.579706 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce6eded-da13-4bb7-a87d-71b87d0e7f06" path="/var/lib/kubelet/pods/2ce6eded-da13-4bb7-a87d-71b87d0e7f06/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.580501 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5" path="/var/lib/kubelet/pods/4dbfd9db-3d17-44af-ab32-d2f7e7a1fab5/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.581310 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df" path="/var/lib/kubelet/pods/5612e7d5-a40f-4207-8e6d-3e4bb9bdc0df/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.582712 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a10f4e7-7906-43aa-98fb-e709a71a55d2" path="/var/lib/kubelet/pods/6a10f4e7-7906-43aa-98fb-e709a71a55d2/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.583606 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbda9f29-b199-4a42-8757-f5ecc90f0437" path="/var/lib/kubelet/pods/cbda9f29-b199-4a42-8757-f5ecc90f0437/volumes"
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.617582 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9"}
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.617642 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"aab8a93b42e209f9c0896ccbc83840d676b303447174bf2e5277b7db9ef5ce9c"}
Feb 17 13:31:48 crc kubenswrapper[4804]: I0217 13:31:48.771317 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"]
Feb 17 13:31:48 crc kubenswrapper[4804]: W0217 13:31:48.777208 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf90149_055d_48ca_9336_ca6d6545f8a3.slice/crio-b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208 WatchSource:0}: Error finding container b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208: Status 404 returned error can't find the container with id b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.629718 4804 generic.go:334] "Generic (PLEG): container finished" podID="e7d80260-64fd-4975-a620-5c515a765fd3" containerID="4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9" exitCode=0
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.629782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82"
event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerDied","Data":"4cc6d7c51e418b43a501474af2e9b9b60e06a16040f7d822d8e1b2cea5711db9"}
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631437 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093" exitCode=0
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093"}
Feb 17 13:31:49 crc kubenswrapper[4804]: I0217 13:31:49.631488 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.027576 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.029594 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.031849 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.037668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077367 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077425 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.077478 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179148 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.179587 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-catalog-content\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.181225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5816c991-ba5a-4d3c-9d69-d28846ca92f6-utilities\") pod \"certified-operators-jhxhx\" (UID: \"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.201863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9p2\" (UniqueName: \"kubernetes.io/projected/5816c991-ba5a-4d3c-9d69-d28846ca92f6-kube-api-access-zt9p2\") pod \"certified-operators-jhxhx\" (UID:
\"5816c991-ba5a-4d3c-9d69-d28846ca92f6\") " pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.349186 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.628987 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.633985 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.638267 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.640087 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.652938 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.657489 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b"}
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.684817 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.685036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.685281 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.771155 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jhxhx"]
Feb 17 13:31:50 crc kubenswrapper[4804]: W0217 13:31:50.774353 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5816c991_ba5a_4d3c_9d69_d28846ca92f6.slice/crio-44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c WatchSource:0}: Error finding container 44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c: Status 404 returned error can't find the container with id 44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786336 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.786694 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.787253 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-utilities\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.787552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57d3429b-b2f5-49ea-94b2-b79aa1769367-catalog-content\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") " pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.806277 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8vn\" (UniqueName: \"kubernetes.io/projected/57d3429b-b2f5-49ea-94b2-b79aa1769367-kube-api-access-5k8vn\") pod \"community-operators-m2bjw\" (UID: \"57d3429b-b2f5-49ea-94b2-b79aa1769367\") "
pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:50 crc kubenswrapper[4804]: I0217 13:31:50.966240 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.355425 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2bjw"]
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.664348 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"bfdfc2d7fc6905547354dfb774070a83182c87e28316ab2f18ad07677a3e9bbb"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.665928 4804 generic.go:334] "Generic (PLEG): container finished" podID="e7d80260-64fd-4975-a620-5c515a765fd3" containerID="fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b" exitCode=0
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.665993 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerDied","Data":"fa2f3974c7128503ab67f47b8f0f2c135f4217d52547f1e2d0231f564911984b"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674043 4804 generic.go:334] "Generic (PLEG): container finished" podID="5816c991-ba5a-4d3c-9d69-d28846ca92f6" containerID="e367fd4183d97d52042e7b9188c938e6f12d4820fbce7a04c1773ea2248fb662" exitCode=0
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674101 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerDied","Data":"e367fd4183d97d52042e7b9188c938e6f12d4820fbce7a04c1773ea2248fb662"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.674130 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"44dc911d44a2a4a6821779b374fb03d80db8333ee5f639859058e7be18d2596c"}
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.677085 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615" exitCode=0
Feb 17 13:31:51 crc kubenswrapper[4804]: I0217 13:31:51.677125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615"}
Feb 17 13:31:52 crc kubenswrapper[4804]: I0217 13:31:52.683182 4804 generic.go:334] "Generic (PLEG): container finished" podID="57d3429b-b2f5-49ea-94b2-b79aa1769367" containerID="f920816953a1e71425cd0949e078b20754c9607ca1084d5d38622e84385b81f6" exitCode=0
Feb 17 13:31:52 crc kubenswrapper[4804]: I0217 13:31:52.683308 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerDied","Data":"f920816953a1e71425cd0949e078b20754c9607ca1084d5d38622e84385b81f6"}
Feb 17 13:31:53 crc kubenswrapper[4804]: I0217 13:31:53.691757 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fs82" event={"ID":"e7d80260-64fd-4975-a620-5c515a765fd3","Type":"ContainerStarted","Data":"61a1703a0dbfcc1dcb3006d155fd893895d87f6e58c62c0e9ff3c6f1569d9df3"}
Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.698116 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746"}
Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.703639 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd"}
Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.705567 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerStarted","Data":"99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca"}
Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.725055 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fs82" podStartSLOduration=4.480414735 podStartE2EDuration="7.72503295s" podCreationTimestamp="2026-02-17 13:31:47 +0000 UTC" firstStartedPulling="2026-02-17 13:31:49.631593715 +0000 UTC m=+383.743013052" lastFinishedPulling="2026-02-17 13:31:52.87621191 +0000 UTC m=+386.987631267" observedRunningTime="2026-02-17 13:31:53.711526375 +0000 UTC m=+387.822945712" watchObservedRunningTime="2026-02-17 13:31:54.72503295 +0000 UTC m=+388.836452297"
Feb 17 13:31:54 crc kubenswrapper[4804]: I0217 13:31:54.742860 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bhcxz" podStartSLOduration=2.173845965 podStartE2EDuration="6.74284674s" podCreationTimestamp="2026-02-17 13:31:48 +0000 UTC" firstStartedPulling="2026-02-17 13:31:49.633064064 +0000 UTC m=+383.744483401" lastFinishedPulling="2026-02-17 13:31:54.202064839 +0000 UTC m=+388.313484176" observedRunningTime="2026-02-17 13:31:54.741345911 +0000 UTC m=+388.852765248"
watchObservedRunningTime="2026-02-17 13:31:54.74284674 +0000 UTC m=+388.854266077"
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.712084 4804 generic.go:334] "Generic (PLEG): container finished" podID="57d3429b-b2f5-49ea-94b2-b79aa1769367" containerID="bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746" exitCode=0
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.712151 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerDied","Data":"bfed1b1fec8bd92d2c322cf6498a26e10ea50d3847ab60bd2d34adae8689a746"}
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.714380 4804 generic.go:334] "Generic (PLEG): container finished" podID="5816c991-ba5a-4d3c-9d69-d28846ca92f6" containerID="b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd" exitCode=0
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.714438 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerDied","Data":"b9933b0363f7f4e4a5625db8e26a5f4a9a76ce22cf10be62a4bd19f9e6534fbd"}
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.835456 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:31:55 crc kubenswrapper[4804]: I0217 13:31:55.835749 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.721473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jhxhx" event={"ID":"5816c991-ba5a-4d3c-9d69-d28846ca92f6","Type":"ContainerStarted","Data":"ade1fc4444eea0de1f2fff03c43e9b9a9b528f44acfc95be11d7194dc1810c81"}
Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.724706 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2bjw" event={"ID":"57d3429b-b2f5-49ea-94b2-b79aa1769367","Type":"ContainerStarted","Data":"d8876f80d0cff4e05b2bb9d76059eed2f80bd1c407188ba2adf986e6b194f57e"}
Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.749452 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jhxhx" podStartSLOduration=2.190632011 podStartE2EDuration="6.749432373s" podCreationTimestamp="2026-02-17 13:31:50 +0000 UTC" firstStartedPulling="2026-02-17 13:31:51.675304977 +0000 UTC m=+385.786724314" lastFinishedPulling="2026-02-17 13:31:56.234105329 +0000 UTC m=+390.345524676" observedRunningTime="2026-02-17 13:31:56.7478151 +0000 UTC m=+390.859234437" watchObservedRunningTime="2026-02-17 13:31:56.749432373 +0000 UTC m=+390.860851710"
Feb 17 13:31:56 crc kubenswrapper[4804]: I0217 13:31:56.773797 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2bjw" podStartSLOduration=3.086527499 podStartE2EDuration="6.773777011s" podCreationTimestamp="2026-02-17 13:31:50 +0000 UTC" firstStartedPulling="2026-02-17 13:31:52.685777303 +0000 UTC m=+386.797196630" lastFinishedPulling="2026-02-17 13:31:56.373026805 +0000 UTC m=+390.484446142" observedRunningTime="2026-02-17 13:31:56.77102933 +0000 UTC m=+390.882448677" watchObservedRunningTime="2026-02-17 13:31:56.773777011 +0000 UTC m=+390.885196348"
Feb 17 13:31:57 crc kubenswrapper[4804]: I0217 13:31:57.985011 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:57 crc kubenswrapper[4804]: I0217 13:31:57.985368 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.067034 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.556544 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.556597 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:31:58 crc kubenswrapper[4804]: I0217 13:31:58.781932 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fs82"
Feb 17 13:31:59 crc kubenswrapper[4804]: I0217 13:31:59.598548 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bhcxz" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" probeResult="failure" output=<
Feb 17 13:31:59 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s
Feb 17 13:31:59 crc kubenswrapper[4804]: >
Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.349641 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.349936 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.389585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.967176 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:32:00 crc kubenswrapper[4804]: I0217 13:32:00.967274 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:32:01 crc kubenswrapper[4804]: I0217 13:32:01.018124 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:32:01 crc kubenswrapper[4804]: I0217 13:32:01.791560 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m2bjw"
Feb 17 13:32:05 crc kubenswrapper[4804]: I0217 13:32:05.908925 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tzg5s"
Feb 17 13:32:05 crc kubenswrapper[4804]: I0217 13:32:05.968559 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"]
Feb 17 13:32:08 crc kubenswrapper[4804]: I0217 13:32:08.623048 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:32:08 crc kubenswrapper[4804]: I0217 13:32:08.698487 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bhcxz"
Feb 17 13:32:10 crc kubenswrapper[4804]: I0217 13:32:10.397698 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jhxhx"
Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.835268 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.838049 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.838341 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5"
Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.839456 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 13:32:25 crc kubenswrapper[4804]: I0217 13:32:25.839749 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7" gracePeriod=600
Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908245 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7" exitCode=0
Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908307 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7"}
Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908839 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"}
Feb 17 13:32:26 crc kubenswrapper[4804]: I0217 13:32:26.908869 4804 scope.go:117] "RemoveContainer" containerID="526cf05fcf3ae1b66378e03d3bdead52b79bddc7294523665171ada60cba034b"
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.012860 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry" containerID="cri-o://9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" gracePeriod=30
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.593271 4804 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k"
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674713 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674825 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674856 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqx9v\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.674896 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675080 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675129 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") pod \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\" (UID: \"b09fea83-e0d3-4a40-b186-8432c3fa7be0\") "
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675568 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.675706 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.680730 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681113 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v" (OuterVolumeSpecName: "kube-api-access-cqx9v") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "kube-api-access-cqx9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.681631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "registry-tls".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.691870 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.707847 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b09fea83-e0d3-4a40-b186-8432c3fa7be0" (UID: "b09fea83-e0d3-4a40-b186-8432c3fa7be0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777074 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777125 4804 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777140 4804 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777151 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqx9v\" (UniqueName: 
\"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-kube-api-access-cqx9v\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777164 4804 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b09fea83-e0d3-4a40-b186-8432c3fa7be0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777177 4804 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b09fea83-e0d3-4a40-b186-8432c3fa7be0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.777189 4804 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b09fea83-e0d3-4a40-b186-8432c3fa7be0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941096 4804 generic.go:334] "Generic (PLEG): container finished" podID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" exitCode=0 Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941133 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerDied","Data":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"} Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" event={"ID":"b09fea83-e0d3-4a40-b186-8432c3fa7be0","Type":"ContainerDied","Data":"4dd741b3c38a0505bebb7c99e18c919af01e075e7767edd7ca2356d4e858351e"} Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941179 4804 scope.go:117] "RemoveContainer" 
containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.941329 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggf6k" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.961411 4804 scope.go:117] "RemoveContainer" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: E0217 13:32:31.961902 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": container with ID starting with 9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9 not found: ID does not exist" containerID="9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.961934 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9"} err="failed to get container status \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": rpc error: code = NotFound desc = could not find container \"9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9\": container with ID starting with 9e9b6cc6c07f2c75051c25e6b523bc4176dc8b200979ccf83ba6ec3102993fc9 not found: ID does not exist" Feb 17 13:32:31 crc kubenswrapper[4804]: I0217 13:32:31.994337 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:32:32 crc kubenswrapper[4804]: I0217 13:32:32.001007 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggf6k"] Feb 17 13:32:32 crc kubenswrapper[4804]: I0217 13:32:32.586386 4804 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" path="/var/lib/kubelet/pods/b09fea83-e0d3-4a40-b186-8432c3fa7be0/volumes" Feb 17 13:34:55 crc kubenswrapper[4804]: I0217 13:34:55.836100 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:34:55 crc kubenswrapper[4804]: I0217 13:34:55.836778 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:35:25 crc kubenswrapper[4804]: I0217 13:35:25.835554 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:35:25 crc kubenswrapper[4804]: I0217 13:35:25.836159 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:35:26 crc kubenswrapper[4804]: I0217 13:35:26.778776 4804 scope.go:117] "RemoveContainer" containerID="a9ed597c3c00b14d9496b5cdcd3501fa4654fd60a6b054f4df6ff45fd2626a2f" Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.835770 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.836503 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.836566 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.837396 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:35:55 crc kubenswrapper[4804]: I0217 13:35:55.837489 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d" gracePeriod=600 Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.188776 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d" exitCode=0 Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.188865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d"} Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.189113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"} Feb 17 13:35:56 crc kubenswrapper[4804]: I0217 13:35:56.189135 4804 scope.go:117] "RemoveContainer" containerID="45ca7a269d38a09ffdae5bd556d7eb92def23c8a4cf6319c270308b20dd056c7" Feb 17 13:38:03 crc kubenswrapper[4804]: I0217 13:38:03.817851 4804 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 13:38:25 crc kubenswrapper[4804]: I0217 13:38:25.835082 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:38:25 crc kubenswrapper[4804]: I0217 13:38:25.835802 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:38:55 crc kubenswrapper[4804]: I0217 13:38:55.834925 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 17 13:38:55 crc kubenswrapper[4804]: I0217 13:38:55.835611 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"] Feb 17 13:39:21 crc kubenswrapper[4804]: E0217 13:39:21.938796 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938808 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.938923 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09fea83-e0d3-4a40-b186-8432c3fa7be0" containerName="registry" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.939353 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j5t89" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945734 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.945788 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.949327 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"] Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.950183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7sfkb" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.952746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"] Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.956076 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pzj97" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.959563 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"] Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.982636 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"] Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.983658 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:21 crc kubenswrapper[4804]: I0217 13:39:21.985686 4804 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l8nlf" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.000324 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"] Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011275 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.011658 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42s5l\" (UniqueName: \"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.112977 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42s5l\" (UniqueName: 
\"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.113529 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.113799 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.134005 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42s5l\" (UniqueName: \"kubernetes.io/projected/112c357f-f1dc-4a07-bba0-ddf54ab071ff-kube-api-access-42s5l\") pod \"cert-manager-858654f9db-7sfkb\" (UID: \"112c357f-f1dc-4a07-bba0-ddf54ab071ff\") " pod="cert-manager/cert-manager-858654f9db-7sfkb" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.134290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgbf\" (UniqueName: \"kubernetes.io/projected/be70f757-4537-489d-a86e-a1b49fc9af75-kube-api-access-7wgbf\") pod \"cert-manager-webhook-687f57d79b-c8nh8\" (UID: \"be70f757-4537-489d-a86e-a1b49fc9af75\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.135150 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-chqqn\" (UniqueName: \"kubernetes.io/projected/9d2d8008-6348-4f24-8085-d30db8558ab3-kube-api-access-chqqn\") pod \"cert-manager-cainjector-cf98fcc89-kbdz5\" (UID: \"9d2d8008-6348-4f24-8085-d30db8558ab3\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.266488 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.278288 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7sfkb" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.301334 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.583428 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c8nh8"] Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.584486 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.717401 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5"] Feb 17 13:39:22 crc kubenswrapper[4804]: W0217 13:39:22.720684 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112c357f_f1dc_4a07_bba0_ddf54ab071ff.slice/crio-682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f WatchSource:0}: Error finding container 682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f: Status 404 returned error can't find the container with id 682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f Feb 17 13:39:22 crc 
kubenswrapper[4804]: W0217 13:39:22.720963 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d2d8008_6348_4f24_8085_d30db8558ab3.slice/crio-3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c WatchSource:0}: Error finding container 3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c: Status 404 returned error can't find the container with id 3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c Feb 17 13:39:22 crc kubenswrapper[4804]: I0217 13:39:22.721390 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7sfkb"] Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.512681 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" event={"ID":"be70f757-4537-489d-a86e-a1b49fc9af75","Type":"ContainerStarted","Data":"f35a35053b8ad7ea21de12e2f8f4752ea28348753de71473652ddc0a8b819cc0"} Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.514113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" event={"ID":"9d2d8008-6348-4f24-8085-d30db8558ab3","Type":"ContainerStarted","Data":"3ac2a541c69b6e55279e8db24111ce294f8bd7e70d15f3ad8a6daa4beb60eb7c"} Feb 17 13:39:23 crc kubenswrapper[4804]: I0217 13:39:23.515266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7sfkb" event={"ID":"112c357f-f1dc-4a07-bba0-ddf54ab071ff","Type":"ContainerStarted","Data":"682cda7d7c57a849494879eadac39bc4bbb373d1f8d21ffc47a38373417b487f"} Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.528050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" event={"ID":"be70f757-4537-489d-a86e-a1b49fc9af75","Type":"ContainerStarted","Data":"b86e05c21bbb75cbb64d4e55e95d54ac4310f8e63ca1474537f83c89b7356cf3"} Feb 17 
13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.528408 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.551298 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" podStartSLOduration=2.18264351 podStartE2EDuration="4.55127749s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.584309772 +0000 UTC m=+836.695729109" lastFinishedPulling="2026-02-17 13:39:24.952943732 +0000 UTC m=+839.064363089" observedRunningTime="2026-02-17 13:39:25.544650372 +0000 UTC m=+839.656069709" watchObservedRunningTime="2026-02-17 13:39:25.55127749 +0000 UTC m=+839.662696827" Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.834984 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.835072 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.835137 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.836297 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:39:25 crc kubenswrapper[4804]: I0217 13:39:25.836405 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" gracePeriod=600 Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.533611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" event={"ID":"9d2d8008-6348-4f24-8085-d30db8558ab3","Type":"ContainerStarted","Data":"33e8ba71037253aef0a408726ffab75c3c60d12eebd0af388def5be96bf29eca"} Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.535697 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7sfkb" event={"ID":"112c357f-f1dc-4a07-bba0-ddf54ab071ff","Type":"ContainerStarted","Data":"5250fbfcb90f23c254a18fdfafde07d341317998eeb98431d0d8b10985a9c93a"} Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539509 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" exitCode=0 Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539566 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e"} Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539596 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"} Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.539612 4804 scope.go:117] "RemoveContainer" containerID="5551daa5df1d49c4efe0e9ec2a10b66e2ea57db472e185436f7abf62112f226d" Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.551405 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kbdz5" podStartSLOduration=1.967981799 podStartE2EDuration="5.55138104s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.723344329 +0000 UTC m=+836.834763666" lastFinishedPulling="2026-02-17 13:39:26.30674357 +0000 UTC m=+840.418162907" observedRunningTime="2026-02-17 13:39:26.548115467 +0000 UTC m=+840.659534824" watchObservedRunningTime="2026-02-17 13:39:26.55138104 +0000 UTC m=+840.662800377" Feb 17 13:39:26 crc kubenswrapper[4804]: I0217 13:39:26.589818 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7sfkb" podStartSLOduration=2.006151632 podStartE2EDuration="5.589800551s" podCreationTimestamp="2026-02-17 13:39:21 +0000 UTC" firstStartedPulling="2026-02-17 13:39:22.723342229 +0000 UTC m=+836.834761566" lastFinishedPulling="2026-02-17 13:39:26.306991128 +0000 UTC m=+840.418410485" observedRunningTime="2026-02-17 13:39:26.587912462 +0000 UTC m=+840.699331799" watchObservedRunningTime="2026-02-17 13:39:26.589800551 +0000 UTC m=+840.701219888" Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.954126 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955086 4804 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller" containerID="cri-o://6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955137 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb" containerID="cri-o://cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955278 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb" containerID="cri-o://d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955246 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955357 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd" containerID="cri-o://d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955329 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node" 
containerID="cri-o://94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" gracePeriod=30 Feb 17 13:39:31 crc kubenswrapper[4804]: I0217 13:39:31.955296 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging" containerID="cri-o://8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" gracePeriod=30 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.000449 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" containerID="cri-o://0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" gracePeriod=30 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.303194 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-c8nh8" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.356302 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.358819 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-acl-logging/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.359436 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-controller/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.359919 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.431796 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q64t2"] Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432174 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432247 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432274 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432289 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432310 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432329 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432349 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432364 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432395 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" 
containerName="kube-rbac-proxy-ovn-metrics" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432411 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432433 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432449 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432474 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432488 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432507 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432523 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432545 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kubecfg-setup" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432560 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kubecfg-setup" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432581 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" 
containerName="kube-rbac-proxy-node" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432597 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.432624 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432641 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="northd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432850 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432879 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="sbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432898 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="nbdb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432922 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432941 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-node" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432968 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.432988 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" 
containerName="northd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433013 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433032 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433056 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433079 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovn-acl-logging" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.433760 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433800 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.433828 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.433847 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.434117 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" containerName="ovnkube-controller" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435184 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435283 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435332 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: 
\"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435332 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435412 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435434 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435451 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435481 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435507 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435526 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435549 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435590 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435617 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435649 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435677 4804 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435685 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435712 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435744 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash" (OuterVolumeSpecName: "host-slash") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435707 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435788 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") pod \"8df4e52a-e578-472b-a6b3-418e9755714f\" (UID: \"8df4e52a-e578-472b-a6b3-418e9755714f\") " Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435729 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435809 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435830 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435788 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435809 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket" (OuterVolumeSpecName: "log-socket") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.435945 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log" (OuterVolumeSpecName: "node-log") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436112 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436219 4804 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436234 4804 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436245 4804 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436254 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436267 4804 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436278 4804 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436289 4804 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.436300 4804 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436311 4804 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436325 4804 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df4e52a-e578-472b-a6b3-418e9755714f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436339 4804 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436350 4804 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436360 4804 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436370 4804 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436379 4804 reconciler_common.go:293] "Volume detached for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436389 4804 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.436400 4804 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.437369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.441016 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb" (OuterVolumeSpecName: "kube-api-access-75nhb") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "kube-api-access-75nhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.441116 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.453951 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8df4e52a-e578-472b-a6b3-418e9755714f" (UID: "8df4e52a-e578-472b-a6b3-418e9755714f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537567 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537609 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: 
\"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537831 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537876 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" 
(UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537891 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537936 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537957 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.537972 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538127 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538186 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538328 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538538 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538568 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538630 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75nhb\" (UniqueName: \"kubernetes.io/projected/8df4e52a-e578-472b-a6b3-418e9755714f-kube-api-access-75nhb\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538645 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df4e52a-e578-472b-a6b3-418e9755714f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.538654 4804 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df4e52a-e578-472b-a6b3-418e9755714f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.575166 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovnkube-controller/3.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.577807 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-acl-logging/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.578926 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v8mv6_8df4e52a-e578-472b-a6b3-418e9755714f/ovn-controller/0.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579312 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579337 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579363 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579371 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579378 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579386 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" exitCode=0 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579393 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" exitCode=143 Feb 17 13:39:32 crc 
kubenswrapper[4804]: I0217 13:39:32.579401 4804 generic.go:334] "Generic (PLEG): container finished" podID="8df4e52a-e578-472b-a6b3-418e9755714f" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" exitCode=143 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.579438 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580604 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580621 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580636 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" 
event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580673 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580689 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580695 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580703 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580709 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580715 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc 
kubenswrapper[4804]: I0217 13:39:32.580721 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580728 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580735 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580746 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580756 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580763 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580769 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580775 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580783 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580789 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580795 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580801 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580808 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580815 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580825 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580836 4804 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580855 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580864 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580967 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580978 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580985 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.580992 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581000 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581006 4804 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581013 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581020 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8mv6" event={"ID":"8df4e52a-e578-472b-a6b3-418e9755714f","Type":"ContainerDied","Data":"a80f9c965ade76b1702626786407637ac7c475f156f06af4c297248b43c44248"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581044 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581051 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581057 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581063 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581069 4804 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581075 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581082 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581089 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581095 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.581102 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.582821 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583311 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583343 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="42eec48d-c990-43e6-8348-d9f78997ec3b" containerID="89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde" exitCode=2 Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerDied","Data":"89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.583391 4804 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a"} Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.584542 4804 scope.go:117] "RemoveContainer" containerID="89324956d07c3785169619878354108e896d85eaace9f0e642b1b5ee9a981bde" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.609638 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.627360 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.631833 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8mv6"] Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.632044 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: 
I0217 13:39:32.639466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639568 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639592 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639613 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639636 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639643 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-slash\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639655 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-var-lib-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639656 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-node-log\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-systemd-units\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-log-socket\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-etc-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639858 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-bin\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.639880 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-cni-netd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640088 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640164 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640220 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-systemd\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640222 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-kubelet\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640270 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640315 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-ovn\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640295 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640273 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640411 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640579 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-env-overrides\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.640527 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-host-run-netns\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.641060 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33fa0baa-0a4a-41c5-976e-5c7f60828272-run-openvswitch\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.641243 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-script-lib\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.642989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovnkube-config\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.645741 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fa0baa-0a4a-41c5-976e-5c7f60828272-ovn-node-metrics-cert\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.662657 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.664642 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s245\" (UniqueName: \"kubernetes.io/projected/33fa0baa-0a4a-41c5-976e-5c7f60828272-kube-api-access-5s245\") pod \"ovnkube-node-q64t2\" (UID: \"33fa0baa-0a4a-41c5-976e-5c7f60828272\") " pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.681879 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.698918 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.713605 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.728451 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.741529 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.754045 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.754783 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780400 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.780906 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780938 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.780962 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.781441 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781462 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781474 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.781763 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781876 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.781986 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.782360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782382 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782398 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.782703 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782809 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.782889 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.783212 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783249 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783269 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.783541 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783635 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.783726 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: W0217 13:39:32.783850 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fa0baa_0a4a_41c5_976e_5c7f60828272.slice/crio-1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62 WatchSource:0}: Error finding container 1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62: Status 404 returned error can't find the container with id 1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784166 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784255 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784285 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784577 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784603 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784618 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"
Feb 17 13:39:32 crc kubenswrapper[4804]: E0217 13:39:32.784953 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.784996 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785014 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785386 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785484 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785868 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.785892 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.786138 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.786320 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787384 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787408 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787730 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.787762 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788225 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788255 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788543 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.788631 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789049 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789069 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789431 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789532 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789934 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.789957 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790328 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790438 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790776 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.790802 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791024 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791044 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791313 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791345 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791756 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.791804 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792130 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792156 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792511 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792533 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792825 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.792845 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"
Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793187 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could
not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793277 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793623 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793648 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793847 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.793875 4804 scope.go:117] "RemoveContainer" containerID="6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 
13:39:32.794158 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe"} err="failed to get container status \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": rpc error: code = NotFound desc = could not find container \"6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe\": container with ID starting with 6aa946aef254a8d934581e8ba2b04f445e48f2f96373096b3d40601ae603fdfe not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794182 4804 scope.go:117] "RemoveContainer" containerID="d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794465 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd"} err="failed to get container status \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": rpc error: code = NotFound desc = could not find container \"d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd\": container with ID starting with d06292316e234bce09354d20910b6c6dd534d16c5bd9ae57b427aba18c827dcd not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794486 4804 scope.go:117] "RemoveContainer" containerID="cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794702 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054"} err="failed to get container status \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": rpc error: code = NotFound desc = could not find container \"cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054\": container with ID starting with 
cd6158ec7006d93ad3ad8346968efcb1475979e9d059b17f5dda559a1da45054 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.794722 4804 scope.go:117] "RemoveContainer" containerID="d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795025 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb"} err="failed to get container status \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": rpc error: code = NotFound desc = could not find container \"d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb\": container with ID starting with d042b2aa632c4618d1a2b430077995e37fed8cdd901e9f0ec73978afcd28b9fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795045 4804 scope.go:117] "RemoveContainer" containerID="6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795308 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7"} err="failed to get container status \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": rpc error: code = NotFound desc = could not find container \"6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7\": container with ID starting with 6de24ba45eb3fd6cf16fd3f638065fe478d9c7035494c04fb6160a452d49e3e7 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795329 4804 scope.go:117] "RemoveContainer" containerID="94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795573 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb"} err="failed to get container status \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": rpc error: code = NotFound desc = could not find container \"94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb\": container with ID starting with 94080c0c52cccc2152591064c7d9f3e288050491938072cca60106b45bc2f3fb not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795593 4804 scope.go:117] "RemoveContainer" containerID="8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795830 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449"} err="failed to get container status \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": rpc error: code = NotFound desc = could not find container \"8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449\": container with ID starting with 8d4f0f538d9cdff5696db005aee00daf5afebd21350b39a23fa053798ef74449 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.795850 4804 scope.go:117] "RemoveContainer" containerID="6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796117 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79"} err="failed to get container status \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": rpc error: code = NotFound desc = could not find container \"6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79\": container with ID starting with 6357925775d5b017779f8137a5f2245cd9da3844fcea96df27a9047ae4bc0d79 not found: ID does not 
exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796142 4804 scope.go:117] "RemoveContainer" containerID="cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796387 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44"} err="failed to get container status \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": rpc error: code = NotFound desc = could not find container \"cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44\": container with ID starting with cb1ee7e58231eeeec7bc0dc22b7d8cff569db07981927c510ca398866476bc44 not found: ID does not exist" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796406 4804 scope.go:117] "RemoveContainer" containerID="0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96" Feb 17 13:39:32 crc kubenswrapper[4804]: I0217 13:39:32.796647 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96"} err="failed to get container status \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": rpc error: code = NotFound desc = could not find container \"0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96\": container with ID starting with 0f115e36e989f6728f32964062aa604b8063c076218039c0c1006d9821275a96 not found: ID does not exist" Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591314 4804 generic.go:334] "Generic (PLEG): container finished" podID="33fa0baa-0a4a-41c5-976e-5c7f60828272" containerID="ba73a264d7f8563e0e9c4d40dbc9af839c153a50177c18a61830fc5c8a477ad1" exitCode=0 Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591360 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" 
event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerDied","Data":"ba73a264d7f8563e0e9c4d40dbc9af839c153a50177c18a61830fc5c8a477ad1"} Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.591514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1f7802bb10116c11f42c676d5fbd1cacfd87b9d9cf4ea7f8b5ddab719e593d62"} Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.598448 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log" Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.599023 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/1.log" Feb 17 13:39:33 crc kubenswrapper[4804]: I0217 13:39:33.599064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kclvs" event={"ID":"42eec48d-c990-43e6-8348-d9f78997ec3b","Type":"ContainerStarted","Data":"ea4800310f7d1b18b67c74de6a007247d05f79d7da23b317f43e391c1b3ebb1f"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.582547 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df4e52a-e578-472b-a6b3-418e9755714f" path="/var/lib/kubelet/pods/8df4e52a-e578-472b-a6b3-418e9755714f/volumes" Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608170 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1120b3852c9eb0d23a2cfc95af8fd714b16650469fba10364cb91d2c99098fa4"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608252 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" 
event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"7e8cd1a8a21365acd0af75a8f75b47e048e01a2a3d2e2a3931a0be35f83db943"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"1eb432af7f00483ec2a113397eddcf290bb0a042d015c649a88ce1f5abae770e"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608300 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"8cedcbbd7a320662899aa0d730be4be04bd5efd746aef8638e64f29498523077"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"9c32de04547fb45496c5f3adb3d2e0cdde1fee35c4b4cfd2943f18c3470e5fb7"} Feb 17 13:39:34 crc kubenswrapper[4804]: I0217 13:39:34.608338 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"94fb2d2ab20bd81235258da1fa469aa591bf35c3d3e5b4c0f9f7d28010aaea5d"} Feb 17 13:39:36 crc kubenswrapper[4804]: I0217 13:39:36.628559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"ffbfe501ece81f184e3f2ce45658eed4f8324c2345bec728ac7d73c042a28e18"} Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648151 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" 
event={"ID":"33fa0baa-0a4a-41c5-976e-5c7f60828272","Type":"ContainerStarted","Data":"790c594836962af94d21c4cf97cbf6eb00279642f06f4954e2d0e2b343b1b9e2"} Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648725 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648799 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.648867 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.673431 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.675265 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" podStartSLOduration=7.675251498 podStartE2EDuration="7.675251498s" podCreationTimestamp="2026-02-17 13:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:39:39.674355949 +0000 UTC m=+853.785775296" watchObservedRunningTime="2026-02-17 13:39:39.675251498 +0000 UTC m=+853.786670835" Feb 17 13:39:39 crc kubenswrapper[4804]: I0217 13:39:39.685249 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:40:02 crc kubenswrapper[4804]: I0217 13:40:02.793344 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q64t2" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.543114 4804 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"] Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.544805 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.551003 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"] Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.552699 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.648402 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod 
\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749408 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749827 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.749916 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.750114 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.768244 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:10 crc kubenswrapper[4804]: I0217 13:40:10.862117 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.078125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"] Feb 17 13:40:11 crc kubenswrapper[4804]: W0217 13:40:11.084662 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c12921_34cb_4c2e_9cb8_585348e46d30.slice/crio-add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9 WatchSource:0}: Error finding container add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9: Status 404 returned error can't find the container with id add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9 Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847370 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="68d055d48c0beb73384b28fec6312b4794a9551a75e4790dd6949b3936abdb4b" exitCode=0 Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847432 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"68d055d48c0beb73384b28fec6312b4794a9551a75e4790dd6949b3936abdb4b"} Feb 17 13:40:11 crc kubenswrapper[4804]: I0217 13:40:11.847492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerStarted","Data":"add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"} Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.730483 4804 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.732326 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.736679 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884352 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884426 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.884471 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985765 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " 
pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985850 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.985891 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.986304 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:12 crc kubenswrapper[4804]: I0217 13:40:12.986359 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.005589 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"redhat-operators-m9764\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.095194 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.295250 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"]
Feb 17 13:40:13 crc kubenswrapper[4804]: W0217 13:40:13.360118 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod360df31e_5543_40bc_a507_76ce8c336d42.slice/crio-16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d WatchSource:0}: Error finding container 16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d: Status 404 returned error can't find the container with id 16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.862002 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="18d43d9e0721b48fc2e9852ee763f7b840ccfa5191315513c3c6d7bb2544d362" exitCode=0
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.862072 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"18d43d9e0721b48fc2e9852ee763f7b840ccfa5191315513c3c6d7bb2544d362"}
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864266 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" exitCode=0
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864320 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea"}
Feb 17 13:40:13 crc kubenswrapper[4804]: I0217 13:40:13.864360 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d"}
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.872665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"}
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.876071 4804 generic.go:334] "Generic (PLEG): container finished" podID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerID="06729888763ca9ac544294c550c944e52dc5e36bdb1fa6f8de896feb8f6c3556" exitCode=0
Feb 17 13:40:14 crc kubenswrapper[4804]: I0217 13:40:14.876111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"06729888763ca9ac544294c550c944e52dc5e36bdb1fa6f8de896feb8f6c3556"}
Feb 17 13:40:15 crc kubenswrapper[4804]: I0217 13:40:15.887308 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" exitCode=0
Feb 17 13:40:15 crc kubenswrapper[4804]: I0217 13:40:15.887447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.241234 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338433 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338538 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.338604 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") pod \"17c12921-34cb-4c2e-9cb8-585348e46d30\" (UID: \"17c12921-34cb-4c2e-9cb8-585348e46d30\") "
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.339965 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle" (OuterVolumeSpecName: "bundle") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.344950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw" (OuterVolumeSpecName: "kube-api-access-pdpsw") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "kube-api-access-pdpsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.353953 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util" (OuterVolumeSpecName: "util") pod "17c12921-34cb-4c2e-9cb8-585348e46d30" (UID: "17c12921-34cb-4c2e-9cb8-585348e46d30"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440716 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdpsw\" (UniqueName: \"kubernetes.io/projected/17c12921-34cb-4c2e-9cb8-585348e46d30-kube-api-access-pdpsw\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440759 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.440773 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17c12921-34cb-4c2e-9cb8-585348e46d30-util\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.898768 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.898781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d" event={"ID":"17c12921-34cb-4c2e-9cb8-585348e46d30","Type":"ContainerDied","Data":"add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.899577 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add656ed6eb63f18068e31561558ccb808018fa2d060b3c0283ab98423605dc9"
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.904238 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerStarted","Data":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"}
Feb 17 13:40:16 crc kubenswrapper[4804]: I0217 13:40:16.927350 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9764" podStartSLOduration=2.449612494 podStartE2EDuration="4.9273252s" podCreationTimestamp="2026-02-17 13:40:12 +0000 UTC" firstStartedPulling="2026-02-17 13:40:13.86593815 +0000 UTC m=+887.977357497" lastFinishedPulling="2026-02-17 13:40:16.343650856 +0000 UTC m=+890.455070203" observedRunningTime="2026-02-17 13:40:16.924635616 +0000 UTC m=+891.036054983" watchObservedRunningTime="2026-02-17 13:40:16.9273252 +0000 UTC m=+891.038744577"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.217881 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218095 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="util"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218106 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="util"
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218117 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218122 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: E0217 13:40:18.218136 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="pull"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218142 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="pull"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218250 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c12921-34cb-4c2e-9cb8-585348e46d30" containerName="extract"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.218623 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.220795 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.221130 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.221348 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qjhp9"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.231379 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.238321 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.339397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.356945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskmj\" (UniqueName: \"kubernetes.io/projected/2789dcb9-5619-4986-a692-1eec733c97ff-kube-api-access-nskmj\") pod \"nmstate-operator-694c9596b7-rkf7s\" (UID: \"2789dcb9-5619-4986-a692-1eec733c97ff\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.533979 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.801154 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rkf7s"]
Feb 17 13:40:18 crc kubenswrapper[4804]: W0217 13:40:18.801544 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2789dcb9_5619_4986_a692_1eec733c97ff.slice/crio-d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa WatchSource:0}: Error finding container d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa: Status 404 returned error can't find the container with id d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa
Feb 17 13:40:18 crc kubenswrapper[4804]: I0217 13:40:18.915279 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s" event={"ID":"2789dcb9-5619-4986-a692-1eec733c97ff","Type":"ContainerStarted","Data":"d1116ffa2c4c1997ac70e9ce631b59a870b4642f589532dcd559cf475b2613fa"}
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.919110 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.920360 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:20 crc kubenswrapper[4804]: I0217 13:40:20.929682 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.015985 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.016090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.016116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116817 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.116861 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.117408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.117401 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.140686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"redhat-marketplace-m57vw\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.236034 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw"
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.479295 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"]
Feb 17 13:40:21 crc kubenswrapper[4804]: W0217 13:40:21.492311 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode364f68f_7e6e_4f69_8884_19064e2ab186.slice/crio-555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692 WatchSource:0}: Error finding container 555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692: Status 404 returned error can't find the container with id 555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961207 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f" exitCode=0
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961292 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f"}
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.961336 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692"}
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.962502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s" event={"ID":"2789dcb9-5619-4986-a692-1eec733c97ff","Type":"ContainerStarted","Data":"42d70f2666785e518cfdf425959617cb4a8bf3f12a5125e26182af3c2af1ec42"}
Feb 17 13:40:21 crc kubenswrapper[4804]: I0217 13:40:21.996551 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rkf7s" podStartSLOduration=1.414879656 podStartE2EDuration="3.996534006s" podCreationTimestamp="2026-02-17 13:40:18 +0000 UTC" firstStartedPulling="2026-02-17 13:40:18.80375031 +0000 UTC m=+892.915169647" lastFinishedPulling="2026-02-17 13:40:21.38540466 +0000 UTC m=+895.496823997" observedRunningTime="2026-02-17 13:40:21.995614807 +0000 UTC m=+896.107034154" watchObservedRunningTime="2026-02-17 13:40:21.996534006 +0000 UTC m=+896.107953343"
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.952995 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"]
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.954370 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.956258 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tslwf"
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.969499 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"]
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.970244 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.976561 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.977021 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"]
Feb 17 13:40:22 crc kubenswrapper[4804]: I0217 13:40:22.982695 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d"}
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.009137 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.016586 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jxn7r"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.017436 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.091346 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.092007 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095603 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095818 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.095827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.096165 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9764"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.096845 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hb7n4"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.100878 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140042 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140114 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140138 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140223 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140236 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.140253 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241292 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241347 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241381 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrn6\" (UniqueName: \"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241461 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241483 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241503 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241517 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.241534 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.241653 4804 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.241704 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair podName:36fd4ae3-048e-4e51-b2fa-875a5c84b8e0 nodeName:}" failed. No retries permitted until 2026-02-17 13:40:23.741686338 +0000 UTC m=+897.853105675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair") pod "nmstate-webhook-866bcb46dc-dbfqz" (UID: "36fd4ae3-048e-4e51-b2fa-875a5c84b8e0") : secret "openshift-nmstate-webhook" not found
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-dbus-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-ovs-socket\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.242040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/81e46a71-360c-4509-ad38-2b2c814a56c2-nmstate-lock\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.265956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54gfw\" (UniqueName: \"kubernetes.io/projected/18e3c061-8633-471f-b2ab-e87e3c0b5d44-kube-api-access-54gfw\") pod \"nmstate-metrics-58c85c668d-8gkbz\" (UID: \"18e3c061-8633-471f-b2ab-e87e3c0b5d44\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.274833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prfp\" (UniqueName: \"kubernetes.io/projected/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-kube-api-access-6prfp\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.279954 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmh5\" (UniqueName: \"kubernetes.io/projected/81e46a71-360c-4509-ad38-2b2c814a56c2-kube-api-access-xbmh5\") pod \"nmstate-handler-jxn7r\" (UID: \"81e46a71-360c-4509-ad38-2b2c814a56c2\") " pod="openshift-nmstate/nmstate-handler-jxn7r"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.306961 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.315372 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-664d7fb4-tx445"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.315997 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-664d7fb4-tx445"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.331814 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664d7fb4-tx445"]
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342453 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrn6\" (UniqueName: \"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.342473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.343357 4804 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Feb 17 13:40:23 crc kubenswrapper[4804]: E0217 13:40:23.343412 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert podName:2158c202-5aa4-47aa-87a1-73e4b9043e78 nodeName:}" failed. No retries permitted until 2026-02-17 13:40:23.843399201 +0000 UTC m=+897.954818538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-bgf7w" (UID: "2158c202-5aa4-47aa-87a1-73e4b9043e78") : secret "plugin-serving-cert" not found
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.343356 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2158c202-5aa4-47aa-87a1-73e4b9043e78-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.363657 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrn6\" (UniqueName: \"kubernetes.io/projected/2158c202-5aa4-47aa-87a1-73e4b9043e78-kube-api-access-jhrn6\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"
Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.376326 4804 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:23 crc kubenswrapper[4804]: W0217 13:40:23.406262 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e46a71_360c_4509_ad38_2b2c814a56c2.slice/crio-4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271 WatchSource:0}: Error finding container 4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271: Status 404 returned error can't find the container with id 4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271 Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443849 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443901 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443919 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.443942 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.444047 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545646 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545688 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545711 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.545816 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-oauth-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-service-ca\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546977 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-console-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.546983 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60243734-ea5d-4197-bb21-b278641ce101-trusted-ca-bundle\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.549764 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-oauth-config\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.549763 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60243734-ea5d-4197-bb21-b278641ce101-console-serving-cert\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.564878 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzzh\" (UniqueName: \"kubernetes.io/projected/60243734-ea5d-4197-bb21-b278641ce101-kube-api-access-pdzzh\") pod \"console-664d7fb4-tx445\" (UID: \"60243734-ea5d-4197-bb21-b278641ce101\") " pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:23 crc kubenswrapper[4804]: I0217 13:40:23.715807 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.816889 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.821333 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36fd4ae3-048e-4e51-b2fa-875a5c84b8e0-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-dbfqz\" (UID: \"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.917022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.918122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:23.921409 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2158c202-5aa4-47aa-87a1-73e4b9043e78-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bgf7w\" (UID: \"2158c202-5aa4-47aa-87a1-73e4b9043e78\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.003264 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jxn7r" event={"ID":"81e46a71-360c-4509-ad38-2b2c814a56c2","Type":"ContainerStarted","Data":"4e562ae07992c27e95ae60ad52fa3c88ff88ed29c452794cce3b4f066dd6e271"} Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.008882 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d" exitCode=0 Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.008939 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d"} Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.031162 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.180778 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m9764" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" probeResult="failure" output=< Feb 17 13:40:24 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 13:40:24 crc kubenswrapper[4804]: > Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.917019 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.923882 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fd4ae3_048e_4e51_b2fa_875a5c84b8e0.slice/crio-45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4 WatchSource:0}: Error finding container 
45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4: Status 404 returned error can't find the container with id 45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4 Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.923971 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w"] Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.933089 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.940511 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2158c202_5aa4_47aa_87a1_73e4b9043e78.slice/crio-0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133 WatchSource:0}: Error finding container 0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133: Status 404 returned error can't find the container with id 0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133 Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.941694 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60243734_ea5d_4197_bb21_b278641ce101.slice/crio-9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc WatchSource:0}: Error finding container 9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc: Status 404 returned error can't find the container with id 9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc Feb 17 13:40:24 crc kubenswrapper[4804]: I0217 13:40:24.947067 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664d7fb4-tx445"] Feb 17 13:40:24 crc kubenswrapper[4804]: W0217 13:40:24.950812 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e3c061_8633_471f_b2ab_e87e3c0b5d44.slice/crio-914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26 WatchSource:0}: Error finding container 914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26: Status 404 returned error can't find the container with id 914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26 Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.015945 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"914e17fae991465764dc2e7a640446413fc2a0859bae032a3d56e04df3246b26"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.017022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" event={"ID":"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0","Type":"ContainerStarted","Data":"45494291fea5bd3e5eb853983ba116cc565cd64cbe47f955397a881c761c35a4"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.018460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664d7fb4-tx445" event={"ID":"60243734-ea5d-4197-bb21-b278641ce101","Type":"ContainerStarted","Data":"9335d26640ddcadd8362aa79b933fcb7772c51b9729fff2c800b7c9b8c9f51cc"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.019789 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" event={"ID":"2158c202-5aa4-47aa-87a1-73e4b9043e78","Type":"ContainerStarted","Data":"0efd4639dcfc5a591a8986a76edabe50af15fdb2fdd8dd5fe7e17a2b498c5133"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.022211 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" 
event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerStarted","Data":"936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31"} Feb 17 13:40:25 crc kubenswrapper[4804]: I0217 13:40:25.046775 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m57vw" podStartSLOduration=2.568216374 podStartE2EDuration="5.046755747s" podCreationTimestamp="2026-02-17 13:40:20 +0000 UTC" firstStartedPulling="2026-02-17 13:40:21.96384416 +0000 UTC m=+896.075263507" lastFinishedPulling="2026-02-17 13:40:24.442383543 +0000 UTC m=+898.553802880" observedRunningTime="2026-02-17 13:40:25.041834632 +0000 UTC m=+899.153253969" watchObservedRunningTime="2026-02-17 13:40:25.046755747 +0000 UTC m=+899.158175104" Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.030660 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664d7fb4-tx445" event={"ID":"60243734-ea5d-4197-bb21-b278641ce101","Type":"ContainerStarted","Data":"d029fc37d28f0fc9736310fd30cf0f3429d4e245f42e37fc11827000f84680bc"} Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.050578 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-664d7fb4-tx445" podStartSLOduration=3.050557821 podStartE2EDuration="3.050557821s" podCreationTimestamp="2026-02-17 13:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:40:26.047495295 +0000 UTC m=+900.158914632" watchObservedRunningTime="2026-02-17 13:40:26.050557821 +0000 UTC m=+900.161977158" Feb 17 13:40:26 crc kubenswrapper[4804]: I0217 13:40:26.861050 4804 scope.go:117] "RemoveContainer" containerID="2df81c301857cf67d2883ea019d2d5fbd31c4c5dea2d9b5c4bfb19302b4cb03a" Feb 17 13:40:27 crc kubenswrapper[4804]: I0217 13:40:27.038352 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-kclvs_42eec48d-c990-43e6-8348-d9f78997ec3b/kube-multus/2.log" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.058691 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jxn7r" event={"ID":"81e46a71-360c-4509-ad38-2b2c814a56c2","Type":"ContainerStarted","Data":"a961e2afe58699decd28bc4b065d6f984dea427bbcb5c3f9dc9ebee7cb470db6"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.059441 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.060124 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"a9398a7bb08979780f96eaf493af71ddbd2a07b2910e645679824d3047d6cfc2"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.069080 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" event={"ID":"36fd4ae3-048e-4e51-b2fa-875a5c84b8e0","Type":"ContainerStarted","Data":"49e7222b3b541d9076fae97cf42fc4edcaa1361f71fa5f57527a9556124e5884"} Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.069357 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.080435 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jxn7r" podStartSLOduration=2.559791299 podStartE2EDuration="6.080416697s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:23.415321129 +0000 UTC m=+897.526740466" lastFinishedPulling="2026-02-17 13:40:26.935946517 +0000 UTC m=+901.047365864" observedRunningTime="2026-02-17 13:40:28.072460857 +0000 UTC m=+902.183880204" 
watchObservedRunningTime="2026-02-17 13:40:28.080416697 +0000 UTC m=+902.191836034" Feb 17 13:40:28 crc kubenswrapper[4804]: I0217 13:40:28.091540 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" podStartSLOduration=3.905836657 podStartE2EDuration="6.091515816s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.925946684 +0000 UTC m=+899.037366031" lastFinishedPulling="2026-02-17 13:40:27.111625853 +0000 UTC m=+901.223045190" observedRunningTime="2026-02-17 13:40:28.088430709 +0000 UTC m=+902.199850056" watchObservedRunningTime="2026-02-17 13:40:28.091515816 +0000 UTC m=+902.202935153" Feb 17 13:40:29 crc kubenswrapper[4804]: I0217 13:40:29.076482 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" event={"ID":"2158c202-5aa4-47aa-87a1-73e4b9043e78","Type":"ContainerStarted","Data":"6ca5b5650b9eeab96ea3c8a13f711527a677e9d1df0164b708cdad34f6ce0e7b"} Feb 17 13:40:29 crc kubenswrapper[4804]: I0217 13:40:29.093361 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bgf7w" podStartSLOduration=2.210334902 podStartE2EDuration="6.093329957s" podCreationTimestamp="2026-02-17 13:40:23 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.944977412 +0000 UTC m=+899.056396749" lastFinishedPulling="2026-02-17 13:40:28.827972467 +0000 UTC m=+902.939391804" observedRunningTime="2026-02-17 13:40:29.091926332 +0000 UTC m=+903.203345669" watchObservedRunningTime="2026-02-17 13:40:29.093329957 +0000 UTC m=+903.204749294" Feb 17 13:40:30 crc kubenswrapper[4804]: I0217 13:40:30.084752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" 
event={"ID":"18e3c061-8633-471f-b2ab-e87e3c0b5d44","Type":"ContainerStarted","Data":"08b91bb4e93da5f46156e9709ce0c854133d52014bfb7c7cd1a5329e6addded0"} Feb 17 13:40:30 crc kubenswrapper[4804]: I0217 13:40:30.111357 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8gkbz" podStartSLOduration=3.233420036 podStartE2EDuration="8.111338376s" podCreationTimestamp="2026-02-17 13:40:22 +0000 UTC" firstStartedPulling="2026-02-17 13:40:24.960573701 +0000 UTC m=+899.071993038" lastFinishedPulling="2026-02-17 13:40:29.838492041 +0000 UTC m=+903.949911378" observedRunningTime="2026-02-17 13:40:30.107940839 +0000 UTC m=+904.219360216" watchObservedRunningTime="2026-02-17 13:40:30.111338376 +0000 UTC m=+904.222757733" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.237002 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.237501 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:31 crc kubenswrapper[4804]: I0217 13:40:31.297353 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:32 crc kubenswrapper[4804]: I0217 13:40:32.156460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:32 crc kubenswrapper[4804]: I0217 13:40:32.202890 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.149493 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.201951 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.418644 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jxn7r" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.716005 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.716077 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.720778 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:33 crc kubenswrapper[4804]: I0217 13:40:33.933183 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.119573 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m57vw" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" containerID="cri-o://936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31" gracePeriod=2 Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.130575 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-664d7fb4-tx445" Feb 17 13:40:34 crc kubenswrapper[4804]: I0217 13:40:34.209554 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"] Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.133058 4804 generic.go:334] "Generic (PLEG): container finished" podID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerID="936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31" exitCode=0 Feb 17 13:40:35 crc 
kubenswrapper[4804]: I0217 13:40:35.133149 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31"} Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134448 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m57vw" event={"ID":"e364f68f-7e6e-4f69-8884-19064e2ab186","Type":"ContainerDied","Data":"555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692"} Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134499 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="555bccc3d2d7e40f2defa172b9440725c7c1ecdee867fd2abee771e51caa9692" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.134630 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9764" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" containerID="cri-o://6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" gracePeriod=2 Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.143711 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269691 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.269801 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") pod \"e364f68f-7e6e-4f69-8884-19064e2ab186\" (UID: \"e364f68f-7e6e-4f69-8884-19064e2ab186\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.270775 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities" (OuterVolumeSpecName: "utilities") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.278130 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln" (OuterVolumeSpecName: "kube-api-access-mvhln") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "kube-api-access-mvhln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.296786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e364f68f-7e6e-4f69-8884-19064e2ab186" (UID: "e364f68f-7e6e-4f69-8884-19064e2ab186"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372387 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372427 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e364f68f-7e6e-4f69-8884-19064e2ab186-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.372440 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvhln\" (UniqueName: \"kubernetes.io/projected/e364f68f-7e6e-4f69-8884-19064e2ab186-kube-api-access-mvhln\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.485477 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.573955 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.574027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.574114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") pod \"360df31e-5543-40bc-a507-76ce8c336d42\" (UID: \"360df31e-5543-40bc-a507-76ce8c336d42\") " Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.575406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities" (OuterVolumeSpecName: "utilities") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.578450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv" (OuterVolumeSpecName: "kube-api-access-sp5hv") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "kube-api-access-sp5hv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.676028 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.676266 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp5hv\" (UniqueName: \"kubernetes.io/projected/360df31e-5543-40bc-a507-76ce8c336d42-kube-api-access-sp5hv\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.693069 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "360df31e-5543-40bc-a507-76ce8c336d42" (UID: "360df31e-5543-40bc-a507-76ce8c336d42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:40:35 crc kubenswrapper[4804]: I0217 13:40:35.777413 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/360df31e-5543-40bc-a507-76ce8c336d42-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149266 4804 generic.go:334] "Generic (PLEG): container finished" podID="360df31e-5543-40bc-a507-76ce8c336d42" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" exitCode=0 Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149368 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m57vw" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149390 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9764" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149401 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"} Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9764" event={"ID":"360df31e-5543-40bc-a507-76ce8c336d42","Type":"ContainerDied","Data":"16a653f264128e5b4be27fbdc5d3721b713d9d72a60c5f9389716871b444025d"} Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.149538 4804 scope.go:117] "RemoveContainer" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.178737 4804 scope.go:117] "RemoveContainer" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.196413 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.207303 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9764"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.212321 4804 scope.go:117] "RemoveContainer" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.217400 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.220754 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m57vw"] Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.230858 
4804 scope.go:117] "RemoveContainer" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.231927 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": container with ID starting with 6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2 not found: ID does not exist" containerID="6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232001 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2"} err="failed to get container status \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": rpc error: code = NotFound desc = could not find container \"6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2\": container with ID starting with 6be395ccaf608cf25496bcfd2047bfa6bcc858d9f7201d4320f1887050b566f2 not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232066 4804 scope.go:117] "RemoveContainer" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.232710 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": container with ID starting with 8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430 not found: ID does not exist" containerID="8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232749 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430"} err="failed to get container status \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": rpc error: code = NotFound desc = could not find container \"8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430\": container with ID starting with 8014bf6a8f40162fd4a662efab0d56ffe33c1d1ee72ef2020eb2bbffe3336430 not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.232786 4804 scope.go:117] "RemoveContainer" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: E0217 13:40:36.233376 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": container with ID starting with a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea not found: ID does not exist" containerID="a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.233423 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea"} err="failed to get container status \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": rpc error: code = NotFound desc = could not find container \"a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea\": container with ID starting with a3f0ddba8829031242e38c416e1b6947c00ac8f0dcde990e7a1bede911bbbcea not found: ID does not exist" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 13:40:36.581549 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="360df31e-5543-40bc-a507-76ce8c336d42" path="/var/lib/kubelet/pods/360df31e-5543-40bc-a507-76ce8c336d42/volumes" Feb 17 13:40:36 crc kubenswrapper[4804]: I0217 
13:40:36.582124 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" path="/var/lib/kubelet/pods/e364f68f-7e6e-4f69-8884-19064e2ab186/volumes" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.339564 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340149 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340161 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340175 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340181 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-utilities" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340191 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340217 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340229 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340235 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc 
kubenswrapper[4804]: E0217 13:40:39.340244 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340249 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="extract-content" Feb 17 13:40:39 crc kubenswrapper[4804]: E0217 13:40:39.340256 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340262 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340398 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="360df31e-5543-40bc-a507-76ce8c336d42" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.340407 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e364f68f-7e6e-4f69-8884-19064e2ab186" containerName="registry-server" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.341261 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.358129 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.427995 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.428147 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.428179 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529117 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529178 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529716 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.529955 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.551798 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"community-operators-8bzvf\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") " pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.675718 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:39 crc kubenswrapper[4804]: I0217 13:40:39.969117 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"] Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180468 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695" exitCode=0 Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180519 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"} Feb 17 13:40:40 crc kubenswrapper[4804]: I0217 13:40:40.180544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"7c939a790c69d09c4cd698a95d3c6e66cbf9bcb5e1dee342b73c64ad91892bab"} Feb 17 13:40:41 crc kubenswrapper[4804]: I0217 13:40:41.194859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203303 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5" exitCode=0 Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" 
event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.203907 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerStarted","Data":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"} Feb 17 13:40:42 crc kubenswrapper[4804]: I0217 13:40:42.223506 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8bzvf" podStartSLOduration=1.829693815 podStartE2EDuration="3.223485862s" podCreationTimestamp="2026-02-17 13:40:39 +0000 UTC" firstStartedPulling="2026-02-17 13:40:40.182245289 +0000 UTC m=+914.293664636" lastFinishedPulling="2026-02-17 13:40:41.576037306 +0000 UTC m=+915.687456683" observedRunningTime="2026-02-17 13:40:42.222665456 +0000 UTC m=+916.334084793" watchObservedRunningTime="2026-02-17 13:40:42.223485862 +0000 UTC m=+916.334905199" Feb 17 13:40:43 crc kubenswrapper[4804]: I0217 13:40:43.928816 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-dbfqz" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.141096 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.142662 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.161144 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.200848 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.200926 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.201115 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302216 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302308 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302352 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.302831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.328377 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"certified-operators-d2786\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") " pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.459891 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2786" Feb 17 13:40:44 crc kubenswrapper[4804]: I0217 13:40:44.923114 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2786"] Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223234 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065" exitCode=0 Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065"} Feb 17 13:40:45 crc kubenswrapper[4804]: I0217 13:40:45.223317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"2d5cbab8cf904e1f2afff630660ba9ad4d8260633fdac34c04024ed3278b2e02"} Feb 17 13:40:46 crc kubenswrapper[4804]: I0217 13:40:46.232550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"} Feb 17 13:40:47 crc kubenswrapper[4804]: I0217 13:40:47.240177 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28" exitCode=0 Feb 17 13:40:47 crc kubenswrapper[4804]: I0217 13:40:47.240236 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" 
event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"} Feb 17 13:40:48 crc kubenswrapper[4804]: I0217 13:40:48.248001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerStarted","Data":"56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"} Feb 17 13:40:48 crc kubenswrapper[4804]: I0217 13:40:48.271328 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2786" podStartSLOduration=1.850627993 podStartE2EDuration="4.271307979s" podCreationTimestamp="2026-02-17 13:40:44 +0000 UTC" firstStartedPulling="2026-02-17 13:40:45.225774967 +0000 UTC m=+919.337194314" lastFinishedPulling="2026-02-17 13:40:47.646454963 +0000 UTC m=+921.757874300" observedRunningTime="2026-02-17 13:40:48.266697005 +0000 UTC m=+922.378116382" watchObservedRunningTime="2026-02-17 13:40:48.271307979 +0000 UTC m=+922.382727336" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.676578 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.676860 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:49 crc kubenswrapper[4804]: I0217 13:40:49.716431 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:50 crc kubenswrapper[4804]: I0217 13:40:50.311401 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8bzvf" Feb 17 13:40:50 crc kubenswrapper[4804]: I0217 13:40:50.731233 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8bzvf"]
Feb 17 13:40:52 crc kubenswrapper[4804]: I0217 13:40:52.288641 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8bzvf" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server" containerID="cri-o://488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" gracePeriod=2
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.247982 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294749 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe" exitCode=0
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"}
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294820 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8bzvf"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8bzvf" event={"ID":"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c","Type":"ContainerDied","Data":"7c939a790c69d09c4cd698a95d3c6e66cbf9bcb5e1dee342b73c64ad91892bab"}
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.294854 4804 scope.go:117] "RemoveContainer" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.311648 4804 scope.go:117] "RemoveContainer" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.325992 4804 scope.go:117] "RemoveContainer" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.340064 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") "
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.340150 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") "
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341296 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") pod \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\" (UID: \"5ed30e2e-0f9b-4fa9-8ba1-049ae539875c\") "
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341400 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities" (OuterVolumeSpecName: "utilities") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.341809 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.345647 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t" (OuterVolumeSpecName: "kube-api-access-9gc4t") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "kube-api-access-9gc4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.345661 4804 scope.go:117] "RemoveContainer" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"
Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.346157 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": container with ID starting with 488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe not found: ID does not exist" containerID="488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346213 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe"} err="failed to get container status \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": rpc error: code = NotFound desc = could not find container \"488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe\": container with ID starting with 488179401e4fcdd5e9b5dc80865809b02744a9f4a1b8ec2955ee17a68c812fbe not found: ID does not exist"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346240 4804 scope.go:117] "RemoveContainer" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"
Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.346671 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": container with ID starting with 214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5 not found: ID does not exist" containerID="214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346711 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5"} err="failed to get container status \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": rpc error: code = NotFound desc = could not find container \"214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5\": container with ID starting with 214edfbeaf443e51042bde372413712d591bc156f060011f8652b530692849d5 not found: ID does not exist"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.346737 4804 scope.go:117] "RemoveContainer" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"
Feb 17 13:40:53 crc kubenswrapper[4804]: E0217 13:40:53.347374 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": container with ID starting with 85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695 not found: ID does not exist" containerID="85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.347469 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695"} err="failed to get container status \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": rpc error: code = NotFound desc = could not find container \"85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695\": container with ID starting with 85b9580c0a64570024cd4eea5cf9d7ed8c24206dba24a4136fbb7a1a33535695 not found: ID does not exist"
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.394324 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" (UID: "5ed30e2e-0f9b-4fa9-8ba1-049ae539875c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.442766 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gc4t\" (UniqueName: \"kubernetes.io/projected/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-kube-api-access-9gc4t\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.442795 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.626225 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"]
Feb 17 13:40:53 crc kubenswrapper[4804]: I0217 13:40:53.650054 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8bzvf"]
Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.460382 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.460667 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.507463 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:54 crc kubenswrapper[4804]: I0217 13:40:54.582306 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" path="/var/lib/kubelet/pods/5ed30e2e-0f9b-4fa9-8ba1-049ae539875c/volumes"
Feb 17 13:40:55 crc kubenswrapper[4804]: I0217 13:40:55.354076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:56 crc kubenswrapper[4804]: I0217 13:40:56.130997 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2786"]
Feb 17 13:40:57 crc kubenswrapper[4804]: I0217 13:40:57.319618 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2786" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server" containerID="cri-o://56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457" gracePeriod=2
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001516 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"]
Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.001923 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-utilities"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001952 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-utilities"
Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.001981 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.001993 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server"
Feb 17 13:40:58 crc kubenswrapper[4804]: E0217 13:40:58.002019 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-content"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.002031 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="extract-content"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.002266 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed30e2e-0f9b-4fa9-8ba1-049ae539875c" containerName="registry-server"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.004091 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.006138 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.017997 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"]
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120595 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.120645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.221831 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.222357 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.222617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.246766 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.327129 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"}
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.327059 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerID="56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457" exitCode=0
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.336119 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.523877 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"]
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.839382 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933330 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.933379 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") pod \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\" (UID: \"3d13c70f-ee22-4434-ae7a-92e62c3caa26\") "
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.934671 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities" (OuterVolumeSpecName: "utilities") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:58 crc kubenswrapper[4804]: I0217 13:40:58.940248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c" (OuterVolumeSpecName: "kube-api-access-pzz5c") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "kube-api-access-pzz5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.009807 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d13c70f-ee22-4434-ae7a-92e62c3caa26" (UID: "3d13c70f-ee22-4434-ae7a-92e62c3caa26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034821 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034857 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d13c70f-ee22-4434-ae7a-92e62c3caa26-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.034871 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzz5c\" (UniqueName: \"kubernetes.io/projected/3d13c70f-ee22-4434-ae7a-92e62c3caa26-kube-api-access-pzz5c\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.266033 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tz5vz" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" containerID="cri-o://f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4" gracePeriod=15
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335726 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="a312566d330b5b43cd5b5e5db5ba88efab0b15fe3b0a36d64c0962e38572777f" exitCode=0
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335812 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"a312566d330b5b43cd5b5e5db5ba88efab0b15fe3b0a36d64c0962e38572777f"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.335842 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerStarted","Data":"ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.339890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2786" event={"ID":"3d13c70f-ee22-4434-ae7a-92e62c3caa26","Type":"ContainerDied","Data":"2d5cbab8cf904e1f2afff630660ba9ad4d8260633fdac34c04024ed3278b2e02"}
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.340246 4804 scope.go:117] "RemoveContainer" containerID="56ef89b633db8972f2e239aff7c04b1a035f5ce59b613d27884a8c95e1120457"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.340400 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2786"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.411116 4804 scope.go:117] "RemoveContainer" containerID="1fe5b1d23d0bb27a90ac079b9dc22996fe723a550aa2c5391daa2ed178e26f28"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.412312 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2786"]
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.419372 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2786"]
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.431450 4804 scope.go:117] "RemoveContainer" containerID="2841989cfb43995c971c4405cddc2c9830da84b3d169d1f91be7e47313003065"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.600266 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tz5vz_9eb6b4b9-9e2e-4f39-9df0-068cfea71701/console/0.log"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.600581 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744926 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.744946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745028 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745094 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") pod \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\" (UID: \"9eb6b4b9-9e2e-4f39-9df0-068cfea71701\") "
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745807 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745813 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.745881 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca" (OuterVolumeSpecName: "service-ca") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.746142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config" (OuterVolumeSpecName: "console-config") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756123 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756306 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q" (OuterVolumeSpecName: "kube-api-access-xq22q") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "kube-api-access-xq22q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.756465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9eb6b4b9-9e2e-4f39-9df0-068cfea71701" (UID: "9eb6b4b9-9e2e-4f39-9df0-068cfea71701"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847283 4804 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847322 4804 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847332 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq22q\" (UniqueName: \"kubernetes.io/projected/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-kube-api-access-xq22q\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847342 4804 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847353 4804 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847361 4804 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:40:59 crc kubenswrapper[4804]: I0217 13:40:59.847369 4804 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eb6b4b9-9e2e-4f39-9df0-068cfea71701-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.347979 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tz5vz_9eb6b4b9-9e2e-4f39-9df0-068cfea71701/console/0.log"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348323 4804 generic.go:334] "Generic (PLEG): container finished" podID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4" exitCode=2
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348398 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerDied","Data":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"}
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348438 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tz5vz" event={"ID":"9eb6b4b9-9e2e-4f39-9df0-068cfea71701","Type":"ContainerDied","Data":"981cd8ca6939145b19efcac42c0b745084dc50ef247139e74f5af40d78e085ba"}
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348402 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-tz5vz"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.348464 4804 scope.go:117] "RemoveContainer" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.386427 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.387249 4804 scope.go:117] "RemoveContainer" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: E0217 13:41:00.388637 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": container with ID starting with f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4 not found: ID does not exist" containerID="f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.388687 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4"} err="failed to get container status \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": rpc error: code = NotFound desc = could not find container \"f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4\": container with ID starting with f15f6246225d4108f70918cfaf0b42c868ccbbfd666ad08822316ad0295da5d4 not found: ID does not exist"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.394966 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tz5vz"]
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.579269 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" path="/var/lib/kubelet/pods/3d13c70f-ee22-4434-ae7a-92e62c3caa26/volumes"
Feb 17 13:41:00 crc kubenswrapper[4804]: I0217 13:41:00.580056 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" path="/var/lib/kubelet/pods/9eb6b4b9-9e2e-4f39-9df0-068cfea71701/volumes"
Feb 17 13:41:01 crc kubenswrapper[4804]: I0217 13:41:01.359990 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="079f52519ec7defe8849f5cf7a1dd012d358385362721d77c032579b21d6da77" exitCode=0
Feb 17 13:41:01 crc kubenswrapper[4804]: I0217 13:41:01.360096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"079f52519ec7defe8849f5cf7a1dd012d358385362721d77c032579b21d6da77"}
Feb 17 13:41:02 crc kubenswrapper[4804]: I0217 13:41:02.379346 4804 generic.go:334] "Generic (PLEG): container finished" podID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerID="36d3022c9303016a4bff30c0b94a6e380411ddc9446661e703721a0c232e7d03" exitCode=0
Feb 17 13:41:02 crc kubenswrapper[4804]: I0217 13:41:02.379415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"36d3022c9303016a4bff30c0b94a6e380411ddc9446661e703721a0c232e7d03"}
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.686520 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h"
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808175 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808249 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.808278 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") pod \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\" (UID: \"7e8c98d2-433f-46f9-a2f3-3a368c1b2608\") "
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.809370 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle" (OuterVolumeSpecName: "bundle") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.813687 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll" (OuterVolumeSpecName: "kube-api-access-x2dll") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "kube-api-access-x2dll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.839701 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util" (OuterVolumeSpecName: "util") pod "7e8c98d2-433f-46f9-a2f3-3a368c1b2608" (UID: "7e8c98d2-433f-46f9-a2f3-3a368c1b2608"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.909993 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dll\" (UniqueName: \"kubernetes.io/projected/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-kube-api-access-x2dll\") on node \"crc\" DevicePath \"\"" Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.910025 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-util\") on node \"crc\" DevicePath \"\"" Feb 17 13:41:03 crc kubenswrapper[4804]: I0217 13:41:03.910041 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8c98d2-433f-46f9-a2f3-3a368c1b2608-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" event={"ID":"7e8c98d2-433f-46f9-a2f3-3a368c1b2608","Type":"ContainerDied","Data":"ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de"} Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396541 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h" Feb 17 13:41:04 crc kubenswrapper[4804]: I0217 13:41:04.396555 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4b034533ab832fa3d6d9792b943b6994648cbfe779c359e76481aaf00925de" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.841316 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"] Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842086 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842099 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842109 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-utilities" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842116 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-utilities" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842125 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="pull" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842131 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="pull" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842139 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842145 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842159 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842166 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842178 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-content" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842185 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="extract-content" Feb 17 13:41:11 crc kubenswrapper[4804]: E0217 13:41:11.842226 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="util" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842234 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="util" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842351 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d13c70f-ee22-4434-ae7a-92e62c3caa26" containerName="registry-server" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842363 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8c98d2-433f-46f9-a2f3-3a368c1b2608" containerName="extract" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842381 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb6b4b9-9e2e-4f39-9df0-068cfea71701" containerName="console" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.842855 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.844743 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.844743 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.845608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.845870 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.846084 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-pwpsc" Feb 17 13:41:11 crc kubenswrapper[4804]: I0217 13:41:11.862311 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"] Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009239 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009311 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: 
\"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.009329 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.110988 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.111073 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.111095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.119696 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-webhook-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.129730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17333d4-cfc6-4129-af9e-a8f2db54988b-apiservice-cert\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.142025 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2zj\" (UniqueName: \"kubernetes.io/projected/c17333d4-cfc6-4129-af9e-a8f2db54988b-kube-api-access-9q2zj\") pod \"metallb-operator-controller-manager-c7c468df9-kbjlb\" (UID: \"c17333d4-cfc6-4129-af9e-a8f2db54988b\") " pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.144663 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"] Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.145339 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.152257 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.152566 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.153070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-h58kr" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.158246 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.167818 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"] Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313751 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.313808 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.414793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.418737 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-apiservice-cert\") pod 
\"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.430881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82716046-7f15-43d7-b9de-8fdb68a44c0b-webhook-cert\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.431617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbp76\" (UniqueName: \"kubernetes.io/projected/82716046-7f15-43d7-b9de-8fdb68a44c0b-kube-api-access-bbp76\") pod \"metallb-operator-webhook-server-996ff79d9-vm8dt\" (UID: \"82716046-7f15-43d7-b9de-8fdb68a44c0b\") " pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.445528 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb"] Feb 17 13:41:12 crc kubenswrapper[4804]: W0217 13:41:12.451022 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17333d4_cfc6_4129_af9e_a8f2db54988b.slice/crio-f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866 WatchSource:0}: Error finding container f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866: Status 404 returned error can't find the container with id f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866 Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.528533 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:12 crc kubenswrapper[4804]: I0217 13:41:12.722910 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt"] Feb 17 13:41:12 crc kubenswrapper[4804]: W0217 13:41:12.729490 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82716046_7f15_43d7_b9de_8fdb68a44c0b.slice/crio-1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97 WatchSource:0}: Error finding container 1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97: Status 404 returned error can't find the container with id 1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97 Feb 17 13:41:13 crc kubenswrapper[4804]: I0217 13:41:13.459223 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" event={"ID":"c17333d4-cfc6-4129-af9e-a8f2db54988b","Type":"ContainerStarted","Data":"f9258481e9a188b243a294c267c401977e29b563dc1bfaff0064063e46945866"} Feb 17 13:41:13 crc kubenswrapper[4804]: I0217 13:41:13.461075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" event={"ID":"82716046-7f15-43d7-b9de-8fdb68a44c0b","Type":"ContainerStarted","Data":"1ad54914a432bf870ed2a155bd52042a044e33ace0156238e77e63d842730e97"} Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.483859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" event={"ID":"c17333d4-cfc6-4129-af9e-a8f2db54988b","Type":"ContainerStarted","Data":"7bd1ef8d29be94d011cdfeb8205fcb4af1e446114c5e0bc34cad73b7049c8bf8"} Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.484283 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:15 crc kubenswrapper[4804]: I0217 13:41:15.511418 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" podStartSLOduration=1.8495849130000002 podStartE2EDuration="4.5114007s" podCreationTimestamp="2026-02-17 13:41:11 +0000 UTC" firstStartedPulling="2026-02-17 13:41:12.454247242 +0000 UTC m=+946.565666579" lastFinishedPulling="2026-02-17 13:41:15.116063029 +0000 UTC m=+949.227482366" observedRunningTime="2026-02-17 13:41:15.509745948 +0000 UTC m=+949.621165285" watchObservedRunningTime="2026-02-17 13:41:15.5114007 +0000 UTC m=+949.622820037" Feb 17 13:41:17 crc kubenswrapper[4804]: I0217 13:41:17.496011 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" event={"ID":"82716046-7f15-43d7-b9de-8fdb68a44c0b","Type":"ContainerStarted","Data":"c009fabd2863594a3f1f3c18019679459783c10d7e581bcda3a7b8fdd4b96759"} Feb 17 13:41:17 crc kubenswrapper[4804]: I0217 13:41:17.496383 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:32 crc kubenswrapper[4804]: I0217 13:41:32.532363 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" Feb 17 13:41:32 crc kubenswrapper[4804]: I0217 13:41:32.547499 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-996ff79d9-vm8dt" podStartSLOduration=16.074899336 podStartE2EDuration="20.547484859s" podCreationTimestamp="2026-02-17 13:41:12 +0000 UTC" firstStartedPulling="2026-02-17 13:41:12.733271813 +0000 UTC m=+946.844691150" lastFinishedPulling="2026-02-17 13:41:17.205857336 +0000 UTC m=+951.317276673" observedRunningTime="2026-02-17 
13:41:17.518901384 +0000 UTC m=+951.630320741" watchObservedRunningTime="2026-02-17 13:41:32.547484859 +0000 UTC m=+966.658904196" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.161289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c7c468df9-kbjlb" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.857738 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5ls9t"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.861110 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863361 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863593 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6z9h5" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.863785 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.865759 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.866681 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.868419 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869275 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869364 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869396 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869440 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869473 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869499 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.869550 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.876014 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 
13:41:52.943931 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wrsrf"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.944975 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954819 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954833 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.954978 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5hn8c" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.955380 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970095 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970157 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970181 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: E0217 13:41:52.970238 4804 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 13:41:52 crc kubenswrapper[4804]: E0217 13:41:52.970317 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert podName:0d003d1c-2370-4291-a035-0ebe8b97cfee nodeName:}" failed. No retries permitted until 2026-02-17 13:41:53.470297801 +0000 UTC m=+987.581717138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert") pod "frr-k8s-webhook-server-78b44bf5bb-gl8tp" (UID: "0d003d1c-2370-4291-a035-0ebe8b97cfee") : secret "frr-k8s-webhook-server-cert" not found Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970239 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970522 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-sockets\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-metrics\") pod \"frr-k8s-5ls9t\" (UID: 
\"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.970924 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971008 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971088 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971498 4804 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2cf110f6-e70a-45af-a634-744262733250-frr-startup\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-frr-conf\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.971974 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2cf110f6-e70a-45af-a634-744262733250-reloader\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.981497 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"] Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.982504 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.988367 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.988861 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cf110f6-e70a-45af-a634-744262733250-metrics-certs\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.991824 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njbjj\" (UniqueName: \"kubernetes.io/projected/2cf110f6-e70a-45af-a634-744262733250-kube-api-access-njbjj\") pod \"frr-k8s-5ls9t\" (UID: \"2cf110f6-e70a-45af-a634-744262733250\") " pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:52 crc kubenswrapper[4804]: I0217 13:41:52.992842 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hllj\" (UniqueName: \"kubernetes.io/projected/0d003d1c-2370-4291-a035-0ebe8b97cfee-kube-api-access-2hllj\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.004695 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"] Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072141 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072209 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072259 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072283 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.072300 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.073138 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ef60181c-19a6-454c-a197-2b0af0ac2edf-metallb-excludel2\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.073599 4804 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.073646 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist podName:ef60181c-19a6-454c-a197-2b0af0ac2edf nodeName:}" failed. No retries permitted until 2026-02-17 13:41:53.573631606 +0000 UTC m=+987.685050943 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist") pod "speaker-wrsrf" (UID: "ef60181c-19a6-454c-a197-2b0af0ac2edf") : secret "metallb-memberlist" not found Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.076141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-metrics-certs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.095775 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhngs\" (UniqueName: \"kubernetes.io/projected/ef60181c-19a6-454c-a197-2b0af0ac2edf-kube-api-access-rhngs\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172672 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.172803 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod 
\"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.174228 4804 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.176948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-metrics-certs\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.181303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.186577 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01625c42-e1b1-470d-b705-47b30fec457a-cert\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.187418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxxq\" (UniqueName: \"kubernetes.io/projected/01625c42-e1b1-470d-b705-47b30fec457a-kube-api-access-jjxxq\") pod \"controller-69bbfbf88f-wg4pd\" (UID: \"01625c42-e1b1-470d-b705-47b30fec457a\") " pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.343673 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.478080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.487475 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d003d1c-2370-4291-a035-0ebe8b97cfee-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gl8tp\" (UID: \"0d003d1c-2370-4291-a035-0ebe8b97cfee\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.492493 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.579844 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.580060 4804 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 13:41:53 crc kubenswrapper[4804]: E0217 13:41:53.580110 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist podName:ef60181c-19a6-454c-a197-2b0af0ac2edf nodeName:}" failed. No retries permitted until 2026-02-17 13:41:54.580095971 +0000 UTC m=+988.691515308 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist") pod "speaker-wrsrf" (UID: "ef60181c-19a6-454c-a197-2b0af0ac2edf") : secret "metallb-memberlist" not found Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.722916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"3c45f79d8d33e0ffd9427a2eaba1620cde9b11dcc3b49d408e4a8dbea30ad617"} Feb 17 13:41:53 crc kubenswrapper[4804]: I0217 13:41:53.988383 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-wg4pd"] Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.363887 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp"] Feb 17 13:41:54 crc kubenswrapper[4804]: W0217 13:41:54.370862 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d003d1c_2370_4291_a035_0ebe8b97cfee.slice/crio-f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e WatchSource:0}: Error finding container f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e: Status 404 returned error can't find the container with id f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.594794 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.602971 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/ef60181c-19a6-454c-a197-2b0af0ac2edf-memberlist\") pod \"speaker-wrsrf\" (UID: \"ef60181c-19a6-454c-a197-2b0af0ac2edf\") " pod="metallb-system/speaker-wrsrf" Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.728527 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" event={"ID":"0d003d1c-2370-4291-a035-0ebe8b97cfee","Type":"ContainerStarted","Data":"f96b60bcad8588e9e3fc182da1503fc0a272f815c284e6e6668222bfaaa2960e"} Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.730635 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"c09b74cc048110353c52f9aae1096e4c8674260ad5f2dcd3da1eda135e5eef04"} Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731350 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731437 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"845e848caa2f8b769d7df1a66ad9e3a5c70a09490ed50bcad12a8bb787f88215"} Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.731506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-wg4pd" event={"ID":"01625c42-e1b1-470d-b705-47b30fec457a","Type":"ContainerStarted","Data":"ad2b7d78a8e8dee83f207f1660f4f8d5ffa09cb48dad1685ebde4a4db2e3d411"} Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.756482 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-wg4pd" podStartSLOduration=2.7564559600000003 podStartE2EDuration="2.75645596s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:41:54.746796419 +0000 UTC m=+988.858215766" watchObservedRunningTime="2026-02-17 13:41:54.75645596 +0000 UTC m=+988.867875297" Feb 17 13:41:54 crc kubenswrapper[4804]: I0217 13:41:54.757759 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wrsrf" Feb 17 13:41:54 crc kubenswrapper[4804]: W0217 13:41:54.782489 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef60181c_19a6_454c_a197_2b0af0ac2edf.slice/crio-4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520 WatchSource:0}: Error finding container 4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520: Status 404 returned error can't find the container with id 4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520 Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"e250d36173e404a0945e19572b7e543c77be4f07a974ac9da4a8b694951defb3"} Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739503 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"42c6823e8bc58c4e3d1883cbddac301df8020c586d1c3684c9adee09bcd76554"} Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739517 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wrsrf" event={"ID":"ef60181c-19a6-454c-a197-2b0af0ac2edf","Type":"ContainerStarted","Data":"4b5785b317d5ca6368f1c79b70ee67c3f48131643dfdd643ba8b1d6edcfeb520"} Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.739904 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-wrsrf" Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.765224 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wrsrf" podStartSLOduration=3.7651908880000002 podStartE2EDuration="3.765190888s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:41:55.762985039 +0000 UTC m=+989.874404386" watchObservedRunningTime="2026-02-17 13:41:55.765190888 +0000 UTC m=+989.876610225" Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.835911 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:41:55 crc kubenswrapper[4804]: I0217 13:41:55.835986 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.840472 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="31a91661ad613675d481d75b9eb1010b10af1c6ecc85489ed53a7783cb7723a2" exitCode=0 Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.840946 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"31a91661ad613675d481d75b9eb1010b10af1c6ecc85489ed53a7783cb7723a2"} Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.843164 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" event={"ID":"0d003d1c-2370-4291-a035-0ebe8b97cfee","Type":"ContainerStarted","Data":"9cbd5bc77080f3176cb4b17720ecda8736208c9c38a000cb824af2f7c6983de3"} Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.843331 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:42:03 crc kubenswrapper[4804]: I0217 13:42:03.886185 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" podStartSLOduration=3.550912601 podStartE2EDuration="11.886163077s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="2026-02-17 13:41:54.374714387 +0000 UTC m=+988.486133734" lastFinishedPulling="2026-02-17 13:42:02.709964873 +0000 UTC m=+996.821384210" observedRunningTime="2026-02-17 13:42:03.881012247 +0000 UTC m=+997.992431584" watchObservedRunningTime="2026-02-17 13:42:03.886163077 +0000 UTC m=+997.997582424" Feb 17 13:42:04 crc kubenswrapper[4804]: I0217 13:42:04.853621 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="286a37042ccecc7feae021e7d3d35a3a25c5461d1f7a95649e636d69ec398d1c" exitCode=0 Feb 17 13:42:04 crc kubenswrapper[4804]: I0217 13:42:04.853852 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"286a37042ccecc7feae021e7d3d35a3a25c5461d1f7a95649e636d69ec398d1c"} Feb 17 13:42:05 crc kubenswrapper[4804]: I0217 13:42:05.866280 4804 generic.go:334] "Generic (PLEG): container finished" podID="2cf110f6-e70a-45af-a634-744262733250" containerID="d183889706e6c6c384274a90a2714435f1701f94a432b836c8f9c14f439d512b" exitCode=0 Feb 17 13:42:05 crc kubenswrapper[4804]: I0217 13:42:05.866345 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerDied","Data":"d183889706e6c6c384274a90a2714435f1701f94a432b836c8f9c14f439d512b"} Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"8a5c15369198d50df5e850b53f78f17fbe8c70b3c65ec19fddcb9ee2117886ac"} Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878846 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"a029044cfab50e217f50db0721984bbf20ff684705b44ebae942d30c10b54c68"} Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878860 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"f9df17c01d36427bb662abaff76e285be5e09e55319387c314d10a038cca9e47"} Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878872 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"774adec7ad193a8c0096330652f3c3ed1acee59c66563bc91df03ca73d822d7c"} Feb 17 13:42:06 crc kubenswrapper[4804]: I0217 13:42:06.878884 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"a0535b85fbdd1d49661d3783776df228ddffff505a184606493b94f135df0702"} Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.892700 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5ls9t" 
event={"ID":"2cf110f6-e70a-45af-a634-744262733250","Type":"ContainerStarted","Data":"c8180c1f95b5adc892185b1b075d9d3853e0d01e17952fb55475654faebc2634"} Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.893074 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:42:07 crc kubenswrapper[4804]: I0217 13:42:07.928070 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5ls9t" podStartSLOduration=6.517898868 podStartE2EDuration="15.928046557s" podCreationTimestamp="2026-02-17 13:41:52 +0000 UTC" firstStartedPulling="2026-02-17 13:41:53.281784062 +0000 UTC m=+987.393203439" lastFinishedPulling="2026-02-17 13:42:02.691931791 +0000 UTC m=+996.803351128" observedRunningTime="2026-02-17 13:42:07.92171198 +0000 UTC m=+1002.033131397" watchObservedRunningTime="2026-02-17 13:42:07.928046557 +0000 UTC m=+1002.039465934" Feb 17 13:42:08 crc kubenswrapper[4804]: I0217 13:42:08.181907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:42:08 crc kubenswrapper[4804]: I0217 13:42:08.220702 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:42:13 crc kubenswrapper[4804]: I0217 13:42:13.351952 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-wg4pd" Feb 17 13:42:13 crc kubenswrapper[4804]: I0217 13:42:13.497781 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gl8tp" Feb 17 13:42:14 crc kubenswrapper[4804]: I0217 13:42:14.763326 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wrsrf" Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.925465 4804 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-55nc6"] Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.926958 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.928596 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-79v2d" Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.929278 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.933169 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 13:42:20 crc kubenswrapper[4804]: I0217 13:42:20.940689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-55nc6"] Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.014810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.116075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmj92\" (UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.142476 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmj92\" 
(UniqueName: \"kubernetes.io/projected/13d9e436-3cb0-4df0-aaf9-e614eba74c89-kube-api-access-cmj92\") pod \"openstack-operator-index-55nc6\" (UID: \"13d9e436-3cb0-4df0-aaf9-e614eba74c89\") " pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.245473 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.697923 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-55nc6"] Feb 17 13:42:21 crc kubenswrapper[4804]: I0217 13:42:21.997809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-55nc6" event={"ID":"13d9e436-3cb0-4df0-aaf9-e614eba74c89","Type":"ContainerStarted","Data":"861d1c69d6a8221ff3032e9b5c4ea80bff43cdfd3c764102c5d643c7cc5ce89c"} Feb 17 13:42:23 crc kubenswrapper[4804]: I0217 13:42:23.184603 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5ls9t" Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.016142 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-55nc6" event={"ID":"13d9e436-3cb0-4df0-aaf9-e614eba74c89","Type":"ContainerStarted","Data":"a5f9f93ea4da96eee98bfd46ea36bb4e837a791f14d21668403ddf6cb911e961"} Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.035436 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-55nc6" podStartSLOduration=2.307146188 podStartE2EDuration="5.035417136s" podCreationTimestamp="2026-02-17 13:42:20 +0000 UTC" firstStartedPulling="2026-02-17 13:42:21.709749956 +0000 UTC m=+1015.821169293" lastFinishedPulling="2026-02-17 13:42:24.438020914 +0000 UTC m=+1018.549440241" observedRunningTime="2026-02-17 13:42:25.031453272 +0000 UTC m=+1019.142872609" 
watchObservedRunningTime="2026-02-17 13:42:25.035417136 +0000 UTC m=+1019.146836493" Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.835049 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:42:25 crc kubenswrapper[4804]: I0217 13:42:25.835122 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.246156 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.247372 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:31 crc kubenswrapper[4804]: I0217 13:42:31.268076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:32 crc kubenswrapper[4804]: I0217 13:42:32.154275 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-55nc6" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.834971 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"] Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.837656 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.841536 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"] Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.841847 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hjhwq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858061 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858136 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.858277 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 
13:42:37.960080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.960182 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.960274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.961053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.961254 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:37 crc kubenswrapper[4804]: I0217 13:42:37.983834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:38 crc kubenswrapper[4804]: I0217 13:42:38.156927 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:38 crc kubenswrapper[4804]: I0217 13:42:38.619740 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq"] Feb 17 13:42:38 crc kubenswrapper[4804]: W0217 13:42:38.628163 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2739bc_c729_4c9f_856b_9a08143fc359.slice/crio-ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105 WatchSource:0}: Error finding container ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105: Status 404 returned error can't find the container with id ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105 Feb 17 13:42:39 crc kubenswrapper[4804]: E0217 13:42:39.000762 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2739bc_c729_4c9f_856b_9a08143fc359.slice/crio-conmon-b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0.scope\": RecentStats: unable to find data in memory cache]" Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164698 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerID="b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0" exitCode=0 Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164743 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"b15b4e31d6288ad5c02820211ee924900b2037c0db297292061f024011e20eb0"} Feb 17 13:42:39 crc kubenswrapper[4804]: I0217 13:42:39.164768 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerStarted","Data":"ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105"} Feb 17 13:42:40 crc kubenswrapper[4804]: I0217 13:42:40.174590 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerID="39408a9beaa99c4209df24118356ddcb6bea1315c96157a9d6f36ce235dcb210" exitCode=0 Feb 17 13:42:40 crc kubenswrapper[4804]: I0217 13:42:40.174881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"39408a9beaa99c4209df24118356ddcb6bea1315c96157a9d6f36ce235dcb210"} Feb 17 13:42:41 crc kubenswrapper[4804]: I0217 13:42:41.184684 4804 generic.go:334] "Generic (PLEG): container finished" podID="fc2739bc-c729-4c9f-856b-9a08143fc359" 
containerID="aeb77e12a460173d6ba6457217034163673580ddc92d9330909fc77828824bae" exitCode=0 Feb 17 13:42:41 crc kubenswrapper[4804]: I0217 13:42:41.184738 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"aeb77e12a460173d6ba6457217034163673580ddc92d9330909fc77828824bae"} Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.432019 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.622778 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623677 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623776 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") pod \"fc2739bc-c729-4c9f-856b-9a08143fc359\" (UID: \"fc2739bc-c729-4c9f-856b-9a08143fc359\") " Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.623771 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle" (OuterVolumeSpecName: "bundle") pod 
"fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.624452 4804 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.629864 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g" (OuterVolumeSpecName: "kube-api-access-wtz5g") pod "fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "kube-api-access-wtz5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.638391 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util" (OuterVolumeSpecName: "util") pod "fc2739bc-c729-4c9f-856b-9a08143fc359" (UID: "fc2739bc-c729-4c9f-856b-9a08143fc359"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.727062 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtz5g\" (UniqueName: \"kubernetes.io/projected/fc2739bc-c729-4c9f-856b-9a08143fc359-kube-api-access-wtz5g\") on node \"crc\" DevicePath \"\"" Feb 17 13:42:42 crc kubenswrapper[4804]: I0217 13:42:42.727125 4804 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc2739bc-c729-4c9f-856b-9a08143fc359-util\") on node \"crc\" DevicePath \"\"" Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" event={"ID":"fc2739bc-c729-4c9f-856b-9a08143fc359","Type":"ContainerDied","Data":"ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105"} Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199701 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7d0ee7f8c7dd1d27b502270de26941298513be5ef685346b7fd8df50fab105" Feb 17 13:42:43 crc kubenswrapper[4804]: I0217 13:42:43.199754 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.667979 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668258 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="pull" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668271 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="pull" Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668279 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="util" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668285 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="util" Feb 17 13:42:44 crc kubenswrapper[4804]: E0217 13:42:44.668305 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668311 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668410 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2739bc-c729-4c9f-856b-9a08143fc359" containerName="extract" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.668784 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.671652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vx6dt" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.686461 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.855457 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.956771 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.977251 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrn7\" (UniqueName: \"kubernetes.io/projected/f69fc148-3a8b-4065-b075-85ecad8339e7-kube-api-access-6mrn7\") pod \"openstack-operator-controller-init-7cb8c4979f-kfx9x\" (UID: \"f69fc148-3a8b-4065-b075-85ecad8339e7\") " pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:44 crc kubenswrapper[4804]: I0217 13:42:44.987985 4804 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:45 crc kubenswrapper[4804]: I0217 13:42:45.211669 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x"] Feb 17 13:42:45 crc kubenswrapper[4804]: W0217 13:42:45.217282 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf69fc148_3a8b_4065_b075_85ecad8339e7.slice/crio-fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b WatchSource:0}: Error finding container fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b: Status 404 returned error can't find the container with id fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b Feb 17 13:42:46 crc kubenswrapper[4804]: I0217 13:42:46.218185 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" event={"ID":"f69fc148-3a8b-4065-b075-85ecad8339e7","Type":"ContainerStarted","Data":"fa0c77523b5532345c885b45d22ca5481ec80a1c8b9d6d7e446cb2a5faf48b1b"} Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.260311 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" event={"ID":"f69fc148-3a8b-4065-b075-85ecad8339e7","Type":"ContainerStarted","Data":"078b6f1e61c73fda49819c4d9e4a1fb2d364c25e76601611efec9ae7181342b0"} Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.260968 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:50 crc kubenswrapper[4804]: I0217 13:42:50.296275 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" 
podStartSLOduration=2.0937269020000002 podStartE2EDuration="6.296260945s" podCreationTimestamp="2026-02-17 13:42:44 +0000 UTC" firstStartedPulling="2026-02-17 13:42:45.22104713 +0000 UTC m=+1039.332466467" lastFinishedPulling="2026-02-17 13:42:49.423581173 +0000 UTC m=+1043.535000510" observedRunningTime="2026-02-17 13:42:50.296094331 +0000 UTC m=+1044.407513678" watchObservedRunningTime="2026-02-17 13:42:50.296260945 +0000 UTC m=+1044.407680282" Feb 17 13:42:54 crc kubenswrapper[4804]: I0217 13:42:54.990695 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7cb8c4979f-kfx9x" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835500 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835816 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.835863 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.836426 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:42:55 crc kubenswrapper[4804]: I0217 13:42:55.836485 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69" gracePeriod=600 Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.305741 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69" exitCode=0 Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.305785 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69"} Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.306001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"} Feb 17 13:42:56 crc kubenswrapper[4804]: I0217 13:42:56.306034 4804 scope.go:117] "RemoveContainer" containerID="8de55925453e9c90c2dd998f586db937cfd6d8bf2a763548f6a43c49f5395c8e" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.468795 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.470639 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.473652 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zvk85" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.479420 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.480490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.485506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4dhxq" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.490394 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.497972 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.545331 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.547075 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.549189 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.549355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.553631 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.552756 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-m6ftc" Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.565726 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:15 crc kubenswrapper[4804]: 
I0217 13:43:15.578174 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.579097 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.585467 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-96794"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.588893 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.596403 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.599320 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.602410 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q7f6c"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.617696 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.618645 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.624340 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wxf2c"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.634152 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.635168 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.640117 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.643513 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.648815 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zgr8"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.653886 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656500 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656589 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656657 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656726 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmcl\" (UniqueName: \"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656860 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656896 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.656936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.674847 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.675617 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.679594 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.690297 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.691496 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.704408 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.709215 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-pgk8x"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.710827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f5pbb"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.714261 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.715394 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw82v\" (UniqueName: \"kubernetes.io/projected/fbc5e6cd-47c6-4199-a0f2-e4292a836fac-kube-api-access-qw82v\") pod \"designate-operator-controller-manager-55cc45767f-bslfv\" (UID: \"fbc5e6cd-47c6-4199-a0f2-e4292a836fac\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.723775 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.724519 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.732871 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7tl8t"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.737573 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6npz\" (UniqueName: \"kubernetes.io/projected/0b746a42-c0b4-4cb9-9352-3623669bad5a-kube-api-access-t6npz\") pod \"cinder-operator-controller-manager-57746b5ff9-wn64m\" (UID: \"0b746a42-c0b4-4cb9-9352-3623669bad5a\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.744573 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.745948 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxch4\" (UniqueName: \"kubernetes.io/projected/545c7d25-7774-4c62-89b8-f491fd4065e8-kube-api-access-xxch4\") pod \"barbican-operator-controller-manager-c4b7d6946-4xvfg\" (UID: \"545c7d25-7774-4c62-89b8-f491fd4065e8\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.750095 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.751092 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.779759 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wg7gs"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780350 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780503 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780557 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780588 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780609 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780673 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780694 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780719 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780749 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmcl\" (UniqueName: \"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.780767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"
Feb 17 13:43:15 crc kubenswrapper[4804]: E0217 13:43:15.781326 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 17 13:43:15 crc kubenswrapper[4804]: E0217 13:43:15.781373 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:16.281356085 +0000 UTC m=+1070.392775422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.781732 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.786233 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4slfz"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.793478 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.798268 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.799369 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.806316 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.806927 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.826641 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fgl99"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.840101 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmcl\" (UniqueName: \"kubernetes.io/projected/5796dc62-bd84-48b7-9c4c-7d5bf1f7e984-kube-api-access-6pmcl\") pod \"glance-operator-controller-manager-68c6d499cb-vt6zw\" (UID: \"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.840190 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.845290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pjq\" (UniqueName: \"kubernetes.io/projected/5fa66dc5-a518-40dd-a4b5-dd2b34425ad5-kube-api-access-74pjq\") pod \"horizon-operator-controller-manager-54fb488b88-t6hlr\" (UID: \"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.846004 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs55s\" (UniqueName: \"kubernetes.io/projected/5727ae12-4720-4470-b5cc-8b8ae81c2af7-kube-api-access-qs55s\") pod \"heat-operator-controller-manager-9595d6797-sxtr2\" (UID: \"5727ae12-4720-4470-b5cc-8b8ae81c2af7\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.881502 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dcg\" (UniqueName: \"kubernetes.io/projected/bf13099a-fbab-41bf-b30c-5c6b1049af19-kube-api-access-b8dcg\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882071 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882140 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882184 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.882266 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.884970 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.900945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.904121 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.912255 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.913044 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.916686 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.918934 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xcktr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.937225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw7hl\" (UniqueName: \"kubernetes.io/projected/d3332002-6930-418f-8288-e8344be70c6a-kube-api-access-mw7hl\") pod \"manila-operator-controller-manager-96fff9cb8-88sh4\" (UID: \"d3332002-6930-418f-8288-e8344be70c6a\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.946172 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xc8\" (UniqueName: \"kubernetes.io/projected/430279ab-ba2f-4838-ab07-b851d4df84a0-kube-api-access-v8xc8\") pod \"keystone-operator-controller-manager-6c78d668d5-pddsh\" (UID: \"430279ab-ba2f-4838-ab07-b851d4df84a0\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.948006 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxvz2\" (UniqueName: \"kubernetes.io/projected/2546387a-6a42-4f8d-a321-2f9cbaa11adb-kube-api-access-bxvz2\") pod \"mariadb-operator-controller-manager-66997756f6-vkdg2\" (UID: \"2546387a-6a42-4f8d-a321-2f9cbaa11adb\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.960471 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.962640 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"]
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.971618 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984512 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.984612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"
Feb 17 13:43:15 crc kubenswrapper[4804]: I0217 13:43:15.988466 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:15.999845 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008110 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cst4d\" (UniqueName: \"kubernetes.io/projected/07b97973-fa08-4b79-9164-918a4d04f8b7-kube-api-access-cst4d\") pod \"ironic-operator-controller-manager-6494cdbf8f-cdpkr\" (UID: \"07b97973-fa08-4b79-9164-918a4d04f8b7\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008203 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pxc28"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.008489 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.044289 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.100925 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.101188 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.112627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.119473 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.121860 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.139250 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.142694 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dxk4c"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.160009 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.166336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4zb\" (UniqueName: \"kubernetes.io/projected/79eb8fb0-6207-44c8-b3c2-a00116bcf10b-kube-api-access-ls4zb\") pod \"octavia-operator-controller-manager-745bbbd77b-ptrs5\" (UID: \"79eb8fb0-6207-44c8-b3c2-a00116bcf10b\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.168929 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pcfl\" (UniqueName: \"kubernetes.io/projected/97925efc-eb46-4a60-b372-b31f13a2c876-kube-api-access-5pcfl\") pod \"neutron-operator-controller-manager-54967dbbdf-l5cl2\" (UID: \"97925efc-eb46-4a60-b372-b31f13a2c876\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.196998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/36b1ca46-becb-417e-b05e-777d40246cb6-kube-api-access-stpzs\") pod \"nova-operator-controller-manager-5ddd85db87-c8hmm\" (UID: \"36b1ca46-becb-417e-b05e-777d40246cb6\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.217895 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.218037 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.218163 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.219174 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.222592 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.238509 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.247833 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-n2l94"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.279598 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.281475 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.282127 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.283805 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.286346 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.286656 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kwvbt"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.287272 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.288675 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x2tmn"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.297355 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298162 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298267 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.298387 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.299992 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xjgzd"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.309751 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.312001 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.314982 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.315654 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.316894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"]
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.317096 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-258zq"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319023 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"
Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: 
\"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319450 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.319485 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.319642 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.319685 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.319671294 +0000 UTC m=+1071.431090621 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.322285 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.322352 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:16.822341847 +0000 UTC m=+1070.933761174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.338313 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.339705 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.342713 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.343065 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.343498 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vp69j" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.346778 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.347349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.420566 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.420818 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " 
pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421069 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421184 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421292 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.421366 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc 
kubenswrapper[4804]: I0217 13:43:16.421928 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod \"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.422012 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.422116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.538848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-kube-api-access-fl5dp\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539539 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539582 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod 
\"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.539779 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.539935 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.539994 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.039967808 +0000 UTC m=+1071.151387145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.541759 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxsbs\" (UniqueName: \"kubernetes.io/projected/ac1e20c8-4527-4bba-85bd-2154e1244d3e-kube-api-access-jxsbs\") pod \"ovn-operator-controller-manager-85c99d655-ltwrc\" (UID: \"ac1e20c8-4527-4bba-85bd-2154e1244d3e\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.543890 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.543939 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.043923322 +0000 UTC m=+1071.155342659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.557014 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.564276 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xglv\" (UniqueName: \"kubernetes.io/projected/42505b9c-f878-4feb-b9a1-9dfa11ec0f56-kube-api-access-7xglv\") pod \"placement-operator-controller-manager-57bd55f9b7-9vbg5\" (UID: \"42505b9c-f878-4feb-b9a1-9dfa11ec0f56\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.573951 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bxk\" (UniqueName: \"kubernetes.io/projected/067b67c8-64c5-4c21-b1b1-770aa68e0eb7-kube-api-access-q7bxk\") pod \"telemetry-operator-controller-manager-56dc67d744-rbrxl\" (UID: \"067b67c8-64c5-4c21-b1b1-770aa68e0eb7\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.589279 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkwhq\" (UniqueName: \"kubernetes.io/projected/f94e791f-16fd-4364-a246-35bcca0d14e6-kube-api-access-rkwhq\") pod \"swift-operator-controller-manager-79558bbfbf-n6fl9\" (UID: \"f94e791f-16fd-4364-a246-35bcca0d14e6\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.677101 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.713751 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mgs\" (UniqueName: \"kubernetes.io/projected/1c7ad838-6225-4001-899a-7f741cb75f2f-kube-api-access-x2mgs\") pod \"test-operator-controller-manager-8467ccb4c8-nwmk5\" (UID: \"1c7ad838-6225-4001-899a-7f741cb75f2f\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.724471 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rcn4\" (UniqueName: \"kubernetes.io/projected/8155784a-3945-4ca3-aa9a-b0e089ffac52-kube-api-access-8rcn4\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.725177 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrs6\" (UniqueName: \"kubernetes.io/projected/57038414-fcca-4a2a-8756-46f97cc57d81-kube-api-access-xbrs6\") pod \"watcher-operator-controller-manager-6c469bc6bb-xlwmb\" (UID: \"57038414-fcca-4a2a-8756-46f97cc57d81\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.732478 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.733530 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.737477 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sjp2t" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.750041 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.780569 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.848633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.848739 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.848915 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: E0217 13:43:16.848966 4804 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:17.848949531 +0000 UTC m=+1071.960368868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:16 crc kubenswrapper[4804]: I0217 13:43:16.961897 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:16.988658 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.005193 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gfn\" (UniqueName: \"kubernetes.io/projected/44ec973d-9403-48f4-b92c-72f0bd485b0f-kube-api-access-28gfn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rtlpm\" (UID: \"44ec973d-9403-48f4-b92c-72f0bd485b0f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.063180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.064279 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.063386 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064465 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:18.064446245 +0000 UTC m=+1072.175865572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064707 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.064790 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:18.064762695 +0000 UTC m=+1072.176182062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.101149 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.130172 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.176253 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.198610 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m"] Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.373249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.373693 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.373762 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:19.373742527 +0000 UTC m=+1073.485161874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.688461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" event={"ID":"0b746a42-c0b4-4cb9-9352-3623669bad5a","Type":"ContainerStarted","Data":"3e8f7e3b7ab6d784584525125ba04e2b9d6d38c51cfef5895e69d5253b26732a"} Feb 17 13:43:17 crc kubenswrapper[4804]: I0217 13:43:17.944000 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.944623 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:17 crc kubenswrapper[4804]: E0217 13:43:17.944688 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:19.944671063 +0000 UTC m=+1074.056090400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.068770 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.094567 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.110951 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.129351 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.133626 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.147454 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.147501 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147660 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147789 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:20.147769611 +0000 UTC m=+1074.259188948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147709 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.147953 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:20.147937626 +0000 UTC m=+1074.259356963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.155020 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545c7d25_7774_4c62_89b8_f491fd4065e8.slice/crio-217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a WatchSource:0}: Error finding container 217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a: Status 404 returned error can't find the container with id 217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.177454 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b97973_fa08_4b79_9164_918a4d04f8b7.slice/crio-0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698 WatchSource:0}: Error finding container 0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698: Status 404 returned error can't find the container with id 0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.178847 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2546387a_6a42_4f8d_a321_2f9cbaa11adb.slice/crio-51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b WatchSource:0}: Error finding container 51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b: Status 404 returned error can't find the container with id 51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.621879 4804 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.622806 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b1ca46_becb_417e_b05e_777d40246cb6.slice/crio-1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80 WatchSource:0}: Error finding container 1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80: Status 404 returned error can't find the container with id 1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.624894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.636689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.638372 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5727ae12_4720_4470_b5cc_8b8ae81c2af7.slice/crio-2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1 WatchSource:0}: Error finding container 2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1: Status 404 returned error can't find the container with id 2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.647914 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa66dc5_a518_40dd_a4b5_dd2b34425ad5.slice/crio-6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78 WatchSource:0}: Error finding container 6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78: 
Status 404 returned error can't find the container with id 6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.648314 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.659531 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.663349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.674543 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.680188 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7bxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-56dc67d744-rbrxl_openstack-operators(067b67c8-64c5-4c21-b1b1-770aa68e0eb7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.681720 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:18 crc 
kubenswrapper[4804]: I0217 13:43:18.687999 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr"] Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.699269 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" event={"ID":"2546387a-6a42-4f8d-a321-2f9cbaa11adb","Type":"ContainerStarted","Data":"51fb5ee19c31768668529187fabe1b7f060f4b662d5391b706263d9825cc905b"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.701668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" event={"ID":"d3332002-6930-418f-8288-e8344be70c6a","Type":"ContainerStarted","Data":"2815fe792194516ef3d3a5f1b6c97356f068723b239cf94254dbb30284d13940"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.703762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" event={"ID":"07b97973-fa08-4b79-9164-918a4d04f8b7","Type":"ContainerStarted","Data":"0005e0fd91a1178d97f1bd1ad095edb4582b88a8045bf48a3fbcd86eaee62698"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.717672 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" event={"ID":"545c7d25-7774-4c62-89b8-f491fd4065e8","Type":"ContainerStarted","Data":"217dd5722838d18dc0b92ebff9346e9ccd105e8323acab78d6781707e0e1d62a"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.721684 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" event={"ID":"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984","Type":"ContainerStarted","Data":"81cdce91c558f3e8224d23b0239cfe80229f03f0d36b9ddf2558e90ffb069bbc"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.723371 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" event={"ID":"97925efc-eb46-4a60-b372-b31f13a2c876","Type":"ContainerStarted","Data":"27c9f34a05463bd2778237b8fd6f4dda2cee45edbef9f7b67c5cccf42c7bbe21"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.725085 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" event={"ID":"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5","Type":"ContainerStarted","Data":"6b86fad254a452e59d45d3b94a73064c4ff9d31ed6074284b6abf2be74051c78"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.726302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" event={"ID":"067b67c8-64c5-4c21-b1b1-770aa68e0eb7","Type":"ContainerStarted","Data":"2ab8f06a4e2b6492a89d17a3df09697e6ee8018f5875feccae6f9451e8581f49"} Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.731115 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.732869 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" event={"ID":"ac1e20c8-4527-4bba-85bd-2154e1244d3e","Type":"ContainerStarted","Data":"2d9c0cbe835a0154cab35f0c29db27a3bdf2eb7f1561992745e2ba9de8a5ee03"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.735470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" 
event={"ID":"79eb8fb0-6207-44c8-b3c2-a00116bcf10b","Type":"ContainerStarted","Data":"6683a351b12ca73497e3085ad06ce185a775b805f769957533ba381f18bcd2c4"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.736555 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" event={"ID":"36b1ca46-becb-417e-b05e-777d40246cb6","Type":"ContainerStarted","Data":"1648019a49a60d8da902c5bf2f03966ca9bb5a880fcd7309895d51d5affdfd80"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.739319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" event={"ID":"fbc5e6cd-47c6-4199-a0f2-e4292a836fac","Type":"ContainerStarted","Data":"e15006572fa3cb4d1a14e535ca276b22aad783c02dd25c35ee4843b763bc9e7d"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.750958 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" event={"ID":"5727ae12-4720-4470-b5cc-8b8ae81c2af7","Type":"ContainerStarted","Data":"2e899f188f003d852e20990736276677358654887e92e6e0ccde9e969f2e64a1"} Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.798039 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9"] Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.820209 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57038414_fcca_4a2a_8756_46f97cc57d81.slice/crio-e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4 WatchSource:0}: Error finding container e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4: Status 404 returned error can't find the container with id e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.824746 4804 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.832910 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x2mgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-nwmk5_openstack-operators(1c7ad838-6225-4001-899a-7f741cb75f2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.833757 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.834335 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.835465 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94e791f_16fd_4364_a246_35bcca0d14e6.slice/crio-eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605 WatchSource:0}: Error finding container eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605: Status 404 returned error can't find the container with id 
eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.835897 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42505b9c_f878_4feb_b9a1_9dfa11ec0f56.slice/crio-ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090 WatchSource:0}: Error finding container ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090: Status 404 returned error can't find the container with id ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090 Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.840076 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rkwhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-79558bbfbf-n6fl9_openstack-operators(f94e791f-16fd-4364-a246-35bcca0d14e6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.840538 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xglv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-9vbg5_openstack-operators(42505b9c-f878-4feb-b9a1-9dfa11ec0f56): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.841462 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.841975 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.846186 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ec973d_9403_48f4_b92c_72f0bd485b0f.slice/crio-4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542 WatchSource:0}: Error finding container 
4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542: Status 404 returned error can't find the container with id 4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542 Feb 17 13:43:18 crc kubenswrapper[4804]: W0217 13:43:18.850925 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430279ab_ba2f_4838_ab07_b851d4df84a0.slice/crio-83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6 WatchSource:0}: Error finding container 83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6: Status 404 returned error can't find the container with id 83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6 Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.852285 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.858215 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} 
{} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28gfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-rtlpm_openstack-operators(44ec973d-9403-48f4-b92c-72f0bd485b0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.858690 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb"] Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.859398 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.859763 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v8xc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6c78d668d5-pddsh_openstack-operators(430279ab-ba2f-4838-ab07-b851d4df84a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 13:43:18 crc kubenswrapper[4804]: E0217 13:43:18.860973 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:18 crc kubenswrapper[4804]: I0217 13:43:18.865709 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh"] Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.469371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.469567 
4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.469682 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:23.469651302 +0000 UTC m=+1077.581070679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.766502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" event={"ID":"44ec973d-9403-48f4-b92c-72f0bd485b0f","Type":"ContainerStarted","Data":"4f04aa0e3d9c6e94cbb398e41a39e0969ccd1b6dd865984521f2b7ecdfdbd542"} Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.767635 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" event={"ID":"57038414-fcca-4a2a-8756-46f97cc57d81","Type":"ContainerStarted","Data":"e360f63e8932c95b8f6843bc5d8e4cce9e05e8bd0bbcf4cc347a7ba22f7318b4"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.768881 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:19 
crc kubenswrapper[4804]: I0217 13:43:19.769178 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" event={"ID":"1c7ad838-6225-4001-899a-7f741cb75f2f","Type":"ContainerStarted","Data":"95a722dfb74ea63952d9a887ace6ed84417ff6c0149e537bbb9344073c2146a8"} Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.770385 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" event={"ID":"42505b9c-f878-4feb-b9a1-9dfa11ec0f56","Type":"ContainerStarted","Data":"ab09d2a9c2aeba5acf347593b3d1b716bbafae068c0201dda744928e0ae0b090"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.772453 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.772489 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.772865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" event={"ID":"430279ab-ba2f-4838-ab07-b851d4df84a0","Type":"ContainerStarted","Data":"83263f215757a36f33b43ea13540e7c6e2ff9a55d4d0372fdc155b4f2a47cfa6"} Feb 17 13:43:19 crc 
kubenswrapper[4804]: E0217 13:43:19.774178 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.776834 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" event={"ID":"f94e791f-16fd-4364-a246-35bcca0d14e6","Type":"ContainerStarted","Data":"eb3564ae49e233544de0c48594532633248245ef6c8c47f3b8587ce82df00605"} Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.779751 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podUID="067b67c8-64c5-4c21-b1b1-770aa68e0eb7" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.779780 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:19 crc kubenswrapper[4804]: I0217 13:43:19.976092 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.976267 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:19 crc kubenswrapper[4804]: E0217 13:43:19.976335 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:23.976316962 +0000 UTC m=+1078.087736299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: I0217 13:43:20.178481 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:20 crc kubenswrapper[4804]: I0217 13:43:20.178554 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " 
pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178735 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178779 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178810 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:24.17879087 +0000 UTC m=+1078.290210207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.178864 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:24.178843772 +0000 UTC m=+1078.290263109 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.781995 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podUID="1c7ad838-6225-4001-899a-7f741cb75f2f" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783499 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podUID="f94e791f-16fd-4364-a246-35bcca0d14e6" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783509 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podUID="44ec973d-9403-48f4-b92c-72f0bd485b0f" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783555 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podUID="42505b9c-f878-4feb-b9a1-9dfa11ec0f56" Feb 17 13:43:20 crc kubenswrapper[4804]: E0217 13:43:20.783558 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9cb0b42ba1836ba4320a0a4660bfdeddea8c0685be379c0000dafb16398f4469\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podUID="430279ab-ba2f-4838-ab07-b851d4df84a0" Feb 17 13:43:23 crc kubenswrapper[4804]: I0217 13:43:23.520956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:23 crc kubenswrapper[4804]: E0217 13:43:23.521171 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:23 crc kubenswrapper[4804]: E0217 13:43:23.521390 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:31.521372469 +0000 UTC m=+1085.632791806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.035463 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.035708 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.035765 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.03574517 +0000 UTC m=+1086.147164517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.239245 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239402 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: I0217 13:43:24.239426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239462 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.239445597 +0000 UTC m=+1086.350864934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239528 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:24 crc kubenswrapper[4804]: E0217 13:43:24.239562 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:32.23955047 +0000 UTC m=+1086.350969807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:31 crc kubenswrapper[4804]: I0217 13:43:31.576171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:31 crc kubenswrapper[4804]: E0217 13:43:31.576559 4804 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:31 crc kubenswrapper[4804]: E0217 13:43:31.577443 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert 
podName:bf13099a-fbab-41bf-b30c-5c6b1049af19 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:47.577420323 +0000 UTC m=+1101.688839660 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert") pod "infra-operator-controller-manager-66d6b5f488-lrjgg" (UID: "bf13099a-fbab-41bf-b30c-5c6b1049af19") : secret "infra-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.085458 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.085639 4804 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.085685 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert podName:ae7598b8-fff5-4044-bbd7-0c8f2f60eed8 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.085672393 +0000 UTC m=+1102.197091730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" (UID: "ae7598b8-fff5-4044-bbd7-0c8f2f60eed8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.288939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289139 4804 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289486 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.289459504 +0000 UTC m=+1102.400878891 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "webhook-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: I0217 13:43:32.289654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289786 4804 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 13:43:32 crc kubenswrapper[4804]: E0217 13:43:32.289842 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs podName:8155784a-3945-4ca3-aa9a-b0e089ffac52 nodeName:}" failed. No retries permitted until 2026-02-17 13:43:48.289831164 +0000 UTC m=+1102.401250561 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs") pod "openstack-operator-controller-manager-5744df64c-mkkrv" (UID: "8155784a-3945-4ca3-aa9a-b0e089ffac52") : secret "metrics-server-cert" not found Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.208581 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.209095 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5pcfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54967dbbdf-l5cl2_openstack-operators(97925efc-eb46-4a60-b372-b31f13a2c876): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.210289 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podUID="97925efc-eb46-4a60-b372-b31f13a2c876" Feb 17 13:43:34 crc kubenswrapper[4804]: E0217 13:43:34.483247 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podUID="97925efc-eb46-4a60-b372-b31f13a2c876" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.185613 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.186060 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ls4zb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-745bbbd77b-ptrs5_openstack-operators(79eb8fb0-6207-44c8-b3c2-a00116bcf10b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.187346 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podUID="79eb8fb0-6207-44c8-b3c2-a00116bcf10b" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.511619 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:3cba74378b21d22a9081b69a7547667220f090ae9281b2eabea35f91dfcf56c6\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podUID="79eb8fb0-6207-44c8-b3c2-a00116bcf10b" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.826676 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.826878 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-stpzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5ddd85db87-c8hmm_openstack-operators(36b1ca46-becb-417e-b05e-777d40246cb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:43:36 crc kubenswrapper[4804]: E0217 13:43:36.828040 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podUID="36b1ca46-becb-417e-b05e-777d40246cb6" Feb 17 13:43:37 crc kubenswrapper[4804]: E0217 13:43:37.517850 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podUID="36b1ca46-becb-417e-b05e-777d40246cb6" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.568007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" event={"ID":"5fa66dc5-a518-40dd-a4b5-dd2b34425ad5","Type":"ContainerStarted","Data":"5c89006876b8b1e9d3045d1c79368457d87477adb1bc57a36422c5e8a712da20"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.568653 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.570190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" event={"ID":"1c7ad838-6225-4001-899a-7f741cb75f2f","Type":"ContainerStarted","Data":"3db8948ca335309c6c47d66c1a2e2d6297e85453fdbb017e59ccad0719040c41"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.570428 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.572276 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" event={"ID":"5796dc62-bd84-48b7-9c4c-7d5bf1f7e984","Type":"ContainerStarted","Data":"54adeb351a52486e17abe0f9b50cf8c806b3e1c6423d9918147e06e66c6219e0"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.572444 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.574053 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" event={"ID":"44ec973d-9403-48f4-b92c-72f0bd485b0f","Type":"ContainerStarted","Data":"76d1816c2540e7c49284c92d90af3e09d0423c04240e37c1b1e42048036c6ca7"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.575833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" event={"ID":"d3332002-6930-418f-8288-e8344be70c6a","Type":"ContainerStarted","Data":"7ab13d28c53db3b199a756c0355824c891323d454fc2a7fdf2a512223a96a156"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.575978 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.578007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" event={"ID":"07b97973-fa08-4b79-9164-918a4d04f8b7","Type":"ContainerStarted","Data":"3585c2a3514981cbd4f883a63b8bc41064406ea99f4aa4056357d4066270222b"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.578289 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.583474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" event={"ID":"5727ae12-4720-4470-b5cc-8b8ae81c2af7","Type":"ContainerStarted","Data":"74a6fd861a620cc6fb415cb407d6c1b308efc0162edb78887a75a978ed5fe3df"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.583635 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 
13:43:43.587780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" event={"ID":"ac1e20c8-4527-4bba-85bd-2154e1244d3e","Type":"ContainerStarted","Data":"615769a1a7ba0a6e6a4183426261541178b669c0e5417273476ef73018dc5a3c"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.588609 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.600010 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" event={"ID":"430279ab-ba2f-4838-ab07-b851d4df84a0","Type":"ContainerStarted","Data":"1ac0307065a2b27a33baeb0e24a8999550a23391cd0ea6bfe01459914f545ad0"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.600367 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.607536 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" event={"ID":"f94e791f-16fd-4364-a246-35bcca0d14e6","Type":"ContainerStarted","Data":"1b75b7dae645d7cac6f38834036ab3236222319a525f659f325cad59a4c98e22"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.608123 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.610559 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" event={"ID":"2546387a-6a42-4f8d-a321-2f9cbaa11adb","Type":"ContainerStarted","Data":"9e4be95bc3db4500b2c4afa726bc78655b58ec83c050ed3fb8500a036379b72f"} Feb 17 13:43:43 crc 
kubenswrapper[4804]: I0217 13:43:43.610715 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.612305 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" event={"ID":"fbc5e6cd-47c6-4199-a0f2-e4292a836fac","Type":"ContainerStarted","Data":"fa84bc9e07f4c4c884d0faf47857f1d82f0e0961c9ea342ff85a45e1df093286"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.612391 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614008 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" event={"ID":"57038414-fcca-4a2a-8756-46f97cc57d81","Type":"ContainerStarted","Data":"8f812716c35bb910bae6c3fc982b3867441f782036702db3c2c5967423b131f6"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.614618 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" podStartSLOduration=7.969450584 podStartE2EDuration="28.614599262s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.657353013 +0000 UTC m=+1072.768772350" lastFinishedPulling="2026-02-17 13:43:39.302501691 +0000 UTC m=+1093.413921028" observedRunningTime="2026-02-17 13:43:43.609598817 +0000 UTC m=+1097.721018164" watchObservedRunningTime="2026-02-17 13:43:43.614599262 +0000 UTC m=+1097.726018599" Feb 17 13:43:43 crc 
kubenswrapper[4804]: I0217 13:43:43.621119 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" event={"ID":"067b67c8-64c5-4c21-b1b1-770aa68e0eb7","Type":"ContainerStarted","Data":"4ad19de3ea4a63c8f85ec290f0edc9ffd4b5584dd043a4158cb9146bf539364d"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.621366 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.625480 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" event={"ID":"545c7d25-7774-4c62-89b8-f491fd4065e8","Type":"ContainerStarted","Data":"54cd6a2d300e8d178521467d3e580af1070b9d46523c6c62e41113e2789e0f9e"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.625736 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.627434 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" event={"ID":"0b746a42-c0b4-4cb9-9352-3623669bad5a","Type":"ContainerStarted","Data":"8df296497c51966ca58ffd39a4d292c89c9e09aa6f855a8ac799b7b96bf36135"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.627589 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.640317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" 
event={"ID":"42505b9c-f878-4feb-b9a1-9dfa11ec0f56","Type":"ContainerStarted","Data":"988834749e5e8d25edd1dd777c56e64197708c6cdaa8c5720b0d7591ba0b80f2"} Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.641542 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.651029 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" podStartSLOduration=7.087970768 podStartE2EDuration="28.651011729s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.669596185 +0000 UTC m=+1072.781015522" lastFinishedPulling="2026-02-17 13:43:40.232637146 +0000 UTC m=+1094.344056483" observedRunningTime="2026-02-17 13:43:43.648463989 +0000 UTC m=+1097.759883356" watchObservedRunningTime="2026-02-17 13:43:43.651011729 +0000 UTC m=+1097.762431066" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.671545 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" podStartSLOduration=8.011535328 podStartE2EDuration="28.671524599s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.641582731 +0000 UTC m=+1072.753002068" lastFinishedPulling="2026-02-17 13:43:39.301571972 +0000 UTC m=+1093.412991339" observedRunningTime="2026-02-17 13:43:43.66960749 +0000 UTC m=+1097.781026827" watchObservedRunningTime="2026-02-17 13:43:43.671524599 +0000 UTC m=+1097.782943936" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.695711 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" podStartSLOduration=4.076119066 podStartE2EDuration="27.695691573s" podCreationTimestamp="2026-02-17 13:43:16 
+0000 UTC" firstStartedPulling="2026-02-17 13:43:18.832390295 +0000 UTC m=+1072.943809632" lastFinishedPulling="2026-02-17 13:43:42.451962802 +0000 UTC m=+1096.563382139" observedRunningTime="2026-02-17 13:43:43.693389411 +0000 UTC m=+1097.804808748" watchObservedRunningTime="2026-02-17 13:43:43.695691573 +0000 UTC m=+1097.807110910" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.760760 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" podStartSLOduration=7.628074881 podStartE2EDuration="28.760741933s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.168844688 +0000 UTC m=+1072.280264025" lastFinishedPulling="2026-02-17 13:43:39.3015117 +0000 UTC m=+1093.412931077" observedRunningTime="2026-02-17 13:43:43.759414812 +0000 UTC m=+1097.870834149" watchObservedRunningTime="2026-02-17 13:43:43.760741933 +0000 UTC m=+1097.872161270" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.800480 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" podStartSLOduration=13.537631135 podStartE2EDuration="28.800458743s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.2140819 +0000 UTC m=+1072.325501237" lastFinishedPulling="2026-02-17 13:43:33.476909508 +0000 UTC m=+1087.588328845" observedRunningTime="2026-02-17 13:43:43.795930601 +0000 UTC m=+1097.907349938" watchObservedRunningTime="2026-02-17 13:43:43.800458743 +0000 UTC m=+1097.911878080" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.828672 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" podStartSLOduration=8.196549562 podStartE2EDuration="28.828656133s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.669527523 +0000 UTC m=+1072.780946860" lastFinishedPulling="2026-02-17 13:43:39.301634044 +0000 UTC m=+1093.413053431" observedRunningTime="2026-02-17 13:43:43.823173352 +0000 UTC m=+1097.934592689" watchObservedRunningTime="2026-02-17 13:43:43.828656133 +0000 UTC m=+1097.940075460" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.859806 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" podStartSLOduration=5.297960109 podStartE2EDuration="28.859789784s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.839877019 +0000 UTC m=+1072.951296356" lastFinishedPulling="2026-02-17 13:43:42.401706684 +0000 UTC m=+1096.513126031" observedRunningTime="2026-02-17 13:43:43.852463685 +0000 UTC m=+1097.963883022" watchObservedRunningTime="2026-02-17 13:43:43.859789784 +0000 UTC m=+1097.971209121" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.915079 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" podStartSLOduration=5.445874915 podStartE2EDuration="28.915063689s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.859258964 +0000 UTC m=+1072.970678301" lastFinishedPulling="2026-02-17 13:43:42.328447738 +0000 UTC m=+1096.439867075" observedRunningTime="2026-02-17 13:43:43.911806177 +0000 UTC m=+1098.023225514" watchObservedRunningTime="2026-02-17 13:43:43.915063689 +0000 UTC m=+1098.026483026" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.935184 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rtlpm" podStartSLOduration=4.35692392 podStartE2EDuration="27.935165247s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.858024345 +0000 UTC m=+1072.969443682" lastFinishedPulling="2026-02-17 13:43:42.436265672 +0000 UTC m=+1096.547685009" observedRunningTime="2026-02-17 13:43:43.933126812 +0000 UTC m=+1098.044546149" watchObservedRunningTime="2026-02-17 13:43:43.935165247 +0000 UTC m=+1098.046584584" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.969851 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" podStartSLOduration=4.260325445 podStartE2EDuration="27.969824167s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.67999565 +0000 UTC m=+1072.791414987" lastFinishedPulling="2026-02-17 13:43:42.389494372 +0000 UTC m=+1096.500913709" observedRunningTime="2026-02-17 13:43:43.966465763 +0000 UTC m=+1098.077885100" watchObservedRunningTime="2026-02-17 13:43:43.969824167 +0000 UTC m=+1098.081243504" Feb 17 13:43:43 crc kubenswrapper[4804]: I0217 13:43:43.985483 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" podStartSLOduration=15.776543852 podStartE2EDuration="28.985463876s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:17.268968358 +0000 UTC m=+1071.380387695" lastFinishedPulling="2026-02-17 13:43:30.477888382 +0000 UTC m=+1084.589307719" observedRunningTime="2026-02-17 13:43:43.983883476 +0000 UTC m=+1098.095302823" watchObservedRunningTime="2026-02-17 13:43:43.985463876 +0000 UTC m=+1098.096883213" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.013524 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" podStartSLOduration=6.944299515 podStartE2EDuration="29.013501031s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.164295037 +0000 UTC m=+1072.275714374" lastFinishedPulling="2026-02-17 13:43:40.233496553 +0000 UTC m=+1094.344915890" observedRunningTime="2026-02-17 13:43:44.010537778 +0000 UTC m=+1098.121957115" watchObservedRunningTime="2026-02-17 13:43:44.013501031 +0000 UTC m=+1098.124920368" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.051100 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" podStartSLOduration=5.490696534 podStartE2EDuration="29.051076723s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.840031464 +0000 UTC m=+1072.951450801" lastFinishedPulling="2026-02-17 13:43:42.400411633 +0000 UTC m=+1096.511830990" observedRunningTime="2026-02-17 13:43:44.047179062 +0000 UTC m=+1098.158598399" watchObservedRunningTime="2026-02-17 13:43:44.051076723 +0000 UTC m=+1098.162496050" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.098147 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" podStartSLOduration=7.620193813 podStartE2EDuration="28.098125222s" podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.823533479 +0000 UTC m=+1072.934952816" lastFinishedPulling="2026-02-17 13:43:39.301464868 +0000 UTC m=+1093.412884225" observedRunningTime="2026-02-17 13:43:44.076818746 +0000 UTC m=+1098.188238083" watchObservedRunningTime="2026-02-17 13:43:44.098125222 +0000 UTC m=+1098.209544559" Feb 17 13:43:44 crc kubenswrapper[4804]: I0217 13:43:44.140350 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" podStartSLOduration=8.064613374 podStartE2EDuration="29.140332818s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" 
firstStartedPulling="2026-02-17 13:43:18.225779215 +0000 UTC m=+1072.337198552" lastFinishedPulling="2026-02-17 13:43:39.301498659 +0000 UTC m=+1093.412917996" observedRunningTime="2026-02-17 13:43:44.137074127 +0000 UTC m=+1098.248493464" watchObservedRunningTime="2026-02-17 13:43:44.140332818 +0000 UTC m=+1098.251752155" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.590806 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" podStartSLOduration=17.354084359 podStartE2EDuration="32.590786442s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.240796164 +0000 UTC m=+1072.352215501" lastFinishedPulling="2026-02-17 13:43:33.477498247 +0000 UTC m=+1087.588917584" observedRunningTime="2026-02-17 13:43:44.191593448 +0000 UTC m=+1098.303012785" watchObservedRunningTime="2026-02-17 13:43:47.590786442 +0000 UTC m=+1101.702205789" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.675610 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.682907 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf13099a-fbab-41bf-b30c-5c6b1049af19-cert\") pod \"infra-operator-controller-manager-66d6b5f488-lrjgg\" (UID: \"bf13099a-fbab-41bf-b30c-5c6b1049af19\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.894532 4804 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zgr8" Feb 17 13:43:47 crc kubenswrapper[4804]: I0217 13:43:47.903247 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.167043 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg"] Feb 17 13:43:48 crc kubenswrapper[4804]: W0217 13:43:48.174535 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf13099a_fbab_41bf_b30c_5c6b1049af19.slice/crio-d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8 WatchSource:0}: Error finding container d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8: Status 404 returned error can't find the container with id d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8 Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.182011 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.190259 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7598b8-fff5-4044-bbd7-0c8f2f60eed8-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88\" (UID: \"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.387044 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.387119 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.393186 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-webhook-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.393192 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8155784a-3945-4ca3-aa9a-b0e089ffac52-metrics-certs\") pod \"openstack-operator-controller-manager-5744df64c-mkkrv\" (UID: \"8155784a-3945-4ca3-aa9a-b0e089ffac52\") " pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.456992 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pxc28" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.465705 4804 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.668354 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vp69j" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.672798 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.693557 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" event={"ID":"bf13099a-fbab-41bf-b30c-5c6b1049af19","Type":"ContainerStarted","Data":"d4777e2c877afdba03c760e23f0f6ea2cb34c789d130eacb89b2f570eda552a8"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.706723 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" event={"ID":"79eb8fb0-6207-44c8-b3c2-a00116bcf10b","Type":"ContainerStarted","Data":"ba6bbf402b813b513db0f5658ae5782c38827546c0e2fdbcbc21df2da06cdb40"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.707114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.708323 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" event={"ID":"97925efc-eb46-4a60-b372-b31f13a2c876","Type":"ContainerStarted","Data":"bdf336cc2fa5990df3f5147256d51e7daef34e921983c80e8bfca1deb02eeaf0"} Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.708687 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.794828 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" podStartSLOduration=4.331396857 podStartE2EDuration="33.794811755s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.648080173 +0000 UTC m=+1072.759499520" lastFinishedPulling="2026-02-17 13:43:48.111495071 +0000 UTC m=+1102.222914418" observedRunningTime="2026-02-17 13:43:48.771311451 +0000 UTC m=+1102.882730788" watchObservedRunningTime="2026-02-17 13:43:48.794811755 +0000 UTC m=+1102.906231092" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.800705 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" podStartSLOduration=4.352108483 podStartE2EDuration="33.800687038s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.665123375 +0000 UTC m=+1072.776542712" lastFinishedPulling="2026-02-17 13:43:48.11370193 +0000 UTC m=+1102.225121267" observedRunningTime="2026-02-17 13:43:48.792686089 +0000 UTC m=+1102.904105416" watchObservedRunningTime="2026-02-17 13:43:48.800687038 +0000 UTC m=+1102.912106375" Feb 17 13:43:48 crc kubenswrapper[4804]: I0217 13:43:48.933960 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88"] Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.376964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv"] Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" event={"ID":"8155784a-3945-4ca3-aa9a-b0e089ffac52","Type":"ContainerStarted","Data":"394848d12a2dc6457a84641291c2f16ae2b72738a3eddca98e7396c1d76144b7"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720792 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" event={"ID":"8155784a-3945-4ca3-aa9a-b0e089ffac52","Type":"ContainerStarted","Data":"37d16acca3dce20bf640eb09ad4a1d3b7c7f55ae3357d6c4c27ffc68339708a5"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.720881 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.722282 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" event={"ID":"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8","Type":"ContainerStarted","Data":"8a63ef5863bc879a6e07d6b2c149cb1b992e69f989c426a9b5decb1ba81836be"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.724056 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" event={"ID":"36b1ca46-becb-417e-b05e-777d40246cb6","Type":"ContainerStarted","Data":"ea6df02621ab67ca7f43e12e1b81f1bc560357f71096a1d2999dac2cb8ad5000"} Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.724424 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.749361 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" podStartSLOduration=33.749346142 podStartE2EDuration="33.749346142s" 
podCreationTimestamp="2026-02-17 13:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:43:49.745706958 +0000 UTC m=+1103.857126295" watchObservedRunningTime="2026-02-17 13:43:49.749346142 +0000 UTC m=+1103.860765479" Feb 17 13:43:49 crc kubenswrapper[4804]: I0217 13:43:49.769309 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" podStartSLOduration=4.375647137 podStartE2EDuration="34.769288164s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:18.648049942 +0000 UTC m=+1072.759469279" lastFinishedPulling="2026-02-17 13:43:49.041690969 +0000 UTC m=+1103.153110306" observedRunningTime="2026-02-17 13:43:49.765850827 +0000 UTC m=+1103.877270154" watchObservedRunningTime="2026-02-17 13:43:49.769288164 +0000 UTC m=+1103.880707501" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.796967 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-4xvfg" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.810339 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-wn64m" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.888449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-bslfv" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.907046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-vt6zw" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.923850 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-9595d6797-sxtr2" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.964542 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-vkdg2" Feb 17 13:43:55 crc kubenswrapper[4804]: I0217 13:43:55.974889 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-t6hlr" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.127553 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-cdpkr" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.224961 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-pddsh" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.242226 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-88sh4" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.284989 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-l5cl2" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.319615 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-c8hmm" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.355765 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-ptrs5" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.561305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-85c99d655-ltwrc" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.679437 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-n6fl9" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.782841 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-9vbg5" Feb 17 13:43:56 crc kubenswrapper[4804]: I0217 13:43:56.995857 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-nwmk5" Feb 17 13:43:57 crc kubenswrapper[4804]: I0217 13:43:57.103977 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-rbrxl" Feb 17 13:43:57 crc kubenswrapper[4804]: I0217 13:43:57.132970 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-xlwmb" Feb 17 13:43:58 crc kubenswrapper[4804]: I0217 13:43:58.682365 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5744df64c-mkkrv" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.234997 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.235708 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/op
enstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:qua
y.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_META
DATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value
:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-an
telope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_
IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_T
EST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fl5dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_openstack-operators(ae7598b8-fff5-4044-bbd7-0c8f2f60eed8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.238621 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podUID="ae7598b8-fff5-4044-bbd7-0c8f2f60eed8" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.750400 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.750582 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8dcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-66d6b5f488-lrjgg_openstack-operators(bf13099a-fbab-41bf-b30c-5c6b1049af19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.751977 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podUID="bf13099a-fbab-41bf-b30c-5c6b1049af19" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.823033 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:fc76cfd501345b5e18ddf48006aa04bcb4cb4020acd83894ed7c4fc952c0232a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podUID="bf13099a-fbab-41bf-b30c-5c6b1049af19" Feb 17 13:44:00 crc kubenswrapper[4804]: E0217 13:44:00.824536 4804 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podUID="ae7598b8-fff5-4044-bbd7-0c8f2f60eed8" Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.920395 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" event={"ID":"bf13099a-fbab-41bf-b30c-5c6b1049af19","Type":"ContainerStarted","Data":"146daf9882dc74acee691f78070232ae2e28634122daa5632d2d466b0cac1b7e"} Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.921356 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:44:14 crc kubenswrapper[4804]: I0217 13:44:14.946564 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" podStartSLOduration=34.03729189 podStartE2EDuration="59.946533787s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:48.176680316 +0000 UTC m=+1102.288099653" lastFinishedPulling="2026-02-17 13:44:14.085922213 +0000 UTC m=+1128.197341550" observedRunningTime="2026-02-17 13:44:14.940603071 +0000 UTC m=+1129.052022418" watchObservedRunningTime="2026-02-17 13:44:14.946533787 +0000 UTC m=+1129.057953174" Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 13:44:16.943833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" event={"ID":"ae7598b8-fff5-4044-bbd7-0c8f2f60eed8","Type":"ContainerStarted","Data":"02b984c9776b998e6e477d37b7d27a0322000916d31ba350356331a3fc9f3763"} Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 
13:44:16.944365 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:44:16 crc kubenswrapper[4804]: I0217 13:44:16.975764 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" podStartSLOduration=34.927563886 podStartE2EDuration="1m1.975739757s" podCreationTimestamp="2026-02-17 13:43:15 +0000 UTC" firstStartedPulling="2026-02-17 13:43:48.953403013 +0000 UTC m=+1103.064822350" lastFinishedPulling="2026-02-17 13:44:16.001578884 +0000 UTC m=+1130.112998221" observedRunningTime="2026-02-17 13:44:16.969526862 +0000 UTC m=+1131.080946239" watchObservedRunningTime="2026-02-17 13:44:16.975739757 +0000 UTC m=+1131.087159114" Feb 17 13:44:27 crc kubenswrapper[4804]: I0217 13:44:27.909690 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-lrjgg" Feb 17 13:44:28 crc kubenswrapper[4804]: I0217 13:44:28.472154 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.931172 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.934008 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.940628 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.940930 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sgq2d" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.941392 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.943115 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 13:44:46 crc kubenswrapper[4804]: I0217 13:44:46.946673 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.018075 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.019218 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.022395 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.029751 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.068322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.068376 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169612 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169709 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169754 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169804 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.169863 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.170875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.195484 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"dnsmasq-dns-675f4bcbfc-rftlc\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc 
kubenswrapper[4804]: I0217 13:44:47.251892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.270735 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.271742 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.271867 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.290724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"dnsmasq-dns-78dd6ddcc-c8wp7\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.337953 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.787003 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.791160 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 13:44:47 crc kubenswrapper[4804]: I0217 13:44:47.865127 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:47 crc kubenswrapper[4804]: W0217 13:44:47.872080 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87f6a03c_039e_4107_985b_803f59ccfb89.slice/crio-94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad WatchSource:0}: Error finding container 94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad: Status 404 returned error can't find the container with id 94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad Feb 17 13:44:48 crc kubenswrapper[4804]: I0217 13:44:48.183161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" event={"ID":"13452752-6880-43b4-9a63-8768d0afa122","Type":"ContainerStarted","Data":"0a2ffab0e99d6480ecf94c911214dd4efd07af4dfd4133a32e88b8a9e531736b"} Feb 17 13:44:48 crc 
kubenswrapper[4804]: I0217 13:44:48.184659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" event={"ID":"87f6a03c-039e-4107-985b-803f59ccfb89","Type":"ContainerStarted","Data":"94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad"} Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.673404 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.707036 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.708117 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.752975 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.819951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.820001 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.820023 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921162 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.921183 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.922057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.922221 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: 
\"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:49 crc kubenswrapper[4804]: I0217 13:44:49.965720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"dnsmasq-dns-666b6646f7-vrzhp\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.038587 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.040770 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.079863 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.083397 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.099608 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.228952 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.229011 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.229251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: 
\"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.330648 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.331488 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.331985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.374155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"dnsmasq-dns-57d769cc4f-kqvs6\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.479136 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.500707 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.864678 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.866098 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869177 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869254 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869460 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869608 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.869911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cxlcf" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.870070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.870310 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 13:44:50 crc kubenswrapper[4804]: I0217 13:44:50.889978 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042085 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042146 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042185 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042236 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042273 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042313 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042343 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042380 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.042458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.121262 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.143165 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.143930 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.144495 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145400 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145626 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " 
pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.145852 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.146776 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.147033 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.147215 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.150882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.150937 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.151354 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.151984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.153128 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.154589 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.156851 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " 
pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.191103 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.222351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.227500 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" event={"ID":"3586301a-dce2-427b-b5c4-9376e59fbf27","Type":"ContainerStarted","Data":"b6acb0860f5dd58b1333ac392aa371b170675172cb3eb7dbaaabc60cbdae0d1e"} Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.229074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" event={"ID":"8175f453-b68b-4236-844d-ff723515fe63","Type":"ContainerStarted","Data":"6dec93dab248c776ff8091a9233f8da9e53443d47dfd060ebb89371b1dc81611"} Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.268689 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.282103 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.286858 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.289787 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.290312 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.290939 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291052 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291148 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291243 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.291458 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m99n4" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352747 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352843 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352938 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.352969 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353007 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353033 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353069 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353223 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.353405 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455351 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455693 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455777 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455813 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455853 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455895 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455924 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.455987 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456015 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456401 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") device 
mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.456916 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457027 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457359 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.457696 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 
13:44:51.458341 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.468942 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.469414 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.470400 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.472141 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.478555 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh4q\" (UniqueName: 
\"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.485457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.497429 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:44:51 crc kubenswrapper[4804]: I0217 13:44:51.624791 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.010461 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.198159 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.202217 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.203869 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.207545 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.208584 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gf9w9" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.210038 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.214219 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.215779 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270238 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270352 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270393 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270417 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.270466 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.274125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"1805a02bed1d8e8fe42a7072ff53aa627c043f3fc1570707e67a0dbc0d5ed7c3"} Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374195 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374305 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374336 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374354 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " 
pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374449 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.374478 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.375145 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.376951 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-operator-scripts\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.378383 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-default\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.379016 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/49b02c8f-ff07-48f9-8012-e78dc6591499-kolla-config\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.382241 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/49b02c8f-ff07-48f9-8012-e78dc6591499-config-data-generated\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.397542 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.412998 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.424183 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b02c8f-ff07-48f9-8012-e78dc6591499-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.424303 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzzm\" (UniqueName: \"kubernetes.io/projected/49b02c8f-ff07-48f9-8012-e78dc6591499-kube-api-access-fbzzm\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.466044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"49b02c8f-ff07-48f9-8012-e78dc6591499\") " pod="openstack/openstack-galera-0" Feb 17 13:44:52 crc kubenswrapper[4804]: W0217 13:44:52.501638 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc485c5b_1bf7_473f_b5b0_a55d5dd0e2ad.slice/crio-3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee WatchSource:0}: Error finding container 3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee: Status 404 returned error can't find the container with id 3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee Feb 17 13:44:52 crc kubenswrapper[4804]: I0217 13:44:52.531928 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.207552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.291397 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee"} Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.701474 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.702583 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705056 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9cslz" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705414 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.705539 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.716538 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.722116 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.760600 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.761953 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.765959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8zpnm" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.766158 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.766994 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.772062 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821464 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821521 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821563 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821644 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821708 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tbp\" (UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821777 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821830 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821879 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821922 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.821941 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924554 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924690 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.924722 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925185 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925294 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925356 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.925601 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.926101 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-config-data\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.926504 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.927178 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.928687 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kolla-config\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.931387 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.931506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932511 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932548 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tbp\" 
(UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.932622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.934989 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eb8e8f-8bd1-4f69-84ee-27213046c709-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.940105 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.943777 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.944770 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9eb8e8f-8bd1-4f69-84ee-27213046c709-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.949523 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f9eb8e8f-8bd1-4f69-84ee-27213046c709-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.960477 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ef96d0-19a6-4561-bde2-cf38e0280b39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.964096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tpq\" (UniqueName: \"kubernetes.io/projected/f9eb8e8f-8bd1-4f69-84ee-27213046c709-kube-api-access-s4tpq\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.967743 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tbp\" (UniqueName: \"kubernetes.io/projected/f5ef96d0-19a6-4561-bde2-cf38e0280b39-kube-api-access-r7tbp\") pod \"memcached-0\" (UID: \"f5ef96d0-19a6-4561-bde2-cf38e0280b39\") " pod="openstack/memcached-0" Feb 17 13:44:53 crc kubenswrapper[4804]: I0217 13:44:53.992785 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f9eb8e8f-8bd1-4f69-84ee-27213046c709\") " pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:54 crc kubenswrapper[4804]: I0217 13:44:54.034129 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 13:44:54 crc kubenswrapper[4804]: I0217 13:44:54.080544 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.101579 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.103081 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.113348 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-q5rh2" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.119023 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.178173 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.279642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.303912 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwbr\" (UniqueName: 
\"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"kube-state-metrics-0\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " pod="openstack/kube-state-metrics-0" Feb 17 13:44:56 crc kubenswrapper[4804]: I0217 13:44:56.429064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.028273 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.029846 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.032732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-86dqn" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.033480 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.033836 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.048786 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.093358 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.094968 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.121643 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140518 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140634 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140659 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140681 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140749 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140781 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140838 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140863 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.140933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242276 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod 
\"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242380 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242417 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242446 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242469 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 
13:44:59.242548 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242616 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242667 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.242713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.243946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-lib\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245737 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245738 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-log-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245769 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-run\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245835 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-var-log\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " 
pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.245894 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45330d20-989c-4507-ae57-5beaee075484-etc-ovs\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.246040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c049787-03d2-4679-8705-ec2cd1ad8141-var-run-ovn\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.246574 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45330d20-989c-4507-ae57-5beaee075484-scripts\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.247750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c049787-03d2-4679-8705-ec2cd1ad8141-scripts\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.251161 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-combined-ca-bundle\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.262980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c049787-03d2-4679-8705-ec2cd1ad8141-ovn-controller-tls-certs\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.264057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqlc\" (UniqueName: \"kubernetes.io/projected/45330d20-989c-4507-ae57-5beaee075484-kube-api-access-fvqlc\") pod \"ovn-controller-ovs-p4wrm\" (UID: \"45330d20-989c-4507-ae57-5beaee075484\") " pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.265958 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xswhb\" (UniqueName: \"kubernetes.io/projected/9c049787-03d2-4679-8705-ec2cd1ad8141-kube-api-access-xswhb\") pod \"ovn-controller-rzcfd\" (UID: \"9c049787-03d2-4679-8705-ec2cd1ad8141\") " pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.364530 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.416080 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.579622 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.581244 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.583654 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gldrt" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584328 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584587 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.584651 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.585823 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.600058 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648555 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648928 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.648997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.649090 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.649137 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750614 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750630 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 
13:44:59.750644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750677 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.750696 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751751 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.751908 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.752727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fc5c8da-b323-4afb-aa47-125fc63caefd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.755925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.756571 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.761118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc5c8da-b323-4afb-aa47-125fc63caefd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.769433 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc 
kubenswrapper[4804]: I0217 13:44:59.771453 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhhrb\" (UniqueName: \"kubernetes.io/projected/0fc5c8da-b323-4afb-aa47-125fc63caefd-kube-api-access-rhhrb\") pod \"ovsdbserver-nb-0\" (UID: \"0fc5c8da-b323-4afb-aa47-125fc63caefd\") " pod="openstack/ovsdbserver-nb-0" Feb 17 13:44:59 crc kubenswrapper[4804]: I0217 13:44:59.896811 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.156936 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.157903 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.161357 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.165520 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.175665 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.262656 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc 
kubenswrapper[4804]: I0217 13:45:00.262717 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.262748 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364372 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364447 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.364477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod 
\"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.365949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.367822 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 13:45:00.404318 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"collect-profiles-29522265-8m8rs\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:00 crc kubenswrapper[4804]: W0217 13:45:00.432512 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49b02c8f_ff07_48f9_8012_e78dc6591499.slice/crio-7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145 WatchSource:0}: Error finding container 7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145: Status 404 returned error can't find the container with id 7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145 Feb 17 13:45:00 crc kubenswrapper[4804]: I0217 
13:45:00.482304 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:01 crc kubenswrapper[4804]: I0217 13:45:01.358851 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"7aa183590318b7c7ef0f9769c4b2d764a25c9c29b68ca5709e3ccc7e86592145"} Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.613000 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.616368 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.623976 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624277 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mnfdx" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624505 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.624695 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.646683 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727811 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727874 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.727999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728095 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vk87\" (UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 
crc kubenswrapper[4804]: I0217 13:45:03.728169 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.728230 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829417 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829586 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vk87\" 
(UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829649 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829678 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829731 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.829773 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.830853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-config\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " 
pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.831251 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.832474 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.833501 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10e1124a-f402-422d-a906-8d22c90d4abe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.841315 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.843996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.844051 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e1124a-f402-422d-a906-8d22c90d4abe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.855447 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.855596 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vk87\" (UniqueName: \"kubernetes.io/projected/10e1124a-f402-422d-a906-8d22c90d4abe-kube-api-access-4vk87\") pod \"ovsdbserver-sb-0\" (UID: \"10e1124a-f402-422d-a906-8d22c90d4abe\") " pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:03 crc kubenswrapper[4804]: I0217 13:45:03.952829 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 13:45:07 crc kubenswrapper[4804]: E0217 13:45:07.544622 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 17 13:45:07 crc kubenswrapper[4804]: E0217 13:45:07.545445 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrqs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7705a06d-bc27-4686-9ca4-4aae248ead07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:45:07 crc 
kubenswrapper[4804]: E0217 13:45:07.546708 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" Feb 17 13:45:08 crc kubenswrapper[4804]: E0217 13:45:08.415299 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" Feb 17 13:45:11 crc kubenswrapper[4804]: I0217 13:45:11.869701 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 13:45:13 crc kubenswrapper[4804]: W0217 13:45:13.862963 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77ee5ee_2b38_4a70_bc28_e2cdf625ab1f.slice/crio-004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db WatchSource:0}: Error finding container 004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db: Status 404 returned error can't find the container with id 004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.918369 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.918672 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq 
--interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs5bs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rftlc_openstack(13452752-6880-43b4-9a63-8768d0afa122): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 
13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.919881 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" podUID="13452752-6880-43b4-9a63-8768d0afa122" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.953427 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.954044 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zx6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-kqvs6_openstack(3586301a-dce2-427b-b5c4-9376e59fbf27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.955759 4804 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.973576 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.973821 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxjdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-c8wp7_openstack(87f6a03c-039e-4107-985b-803f59ccfb89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.975348 4804 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" podUID="87f6a03c-039e-4107-985b-803f59ccfb89" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.987023 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.987241 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kv9pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-vrzhp_openstack(8175f453-b68b-4236-844d-ff723515fe63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:45:13 crc kubenswrapper[4804]: E0217 13:45:13.988481 4804 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" podUID="8175f453-b68b-4236-844d-ff723515fe63" Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.341693 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.347884 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ef96d0_19a6_4561_bde2_cf38e0280b39.slice/crio-819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c WatchSource:0}: Error finding container 819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c: Status 404 returned error can't find the container with id 819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.459680 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5ef96d0-19a6-4561-bde2-cf38e0280b39","Type":"ContainerStarted","Data":"819af1b6546c0ce28efac1d1b94b84f1e414aa1d8f9cb15d012ca3352c3f5f7c"} Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.462316 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerStarted","Data":"96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568"} Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.462357 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerStarted","Data":"004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db"} Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.465991 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99"} Feb 17 13:45:14 crc kubenswrapper[4804]: E0217 13:45:14.468227 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27" Feb 17 13:45:14 crc kubenswrapper[4804]: E0217 13:45:14.468449 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" podUID="8175f453-b68b-4236-844d-ff723515fe63" Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.549545 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd"] Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.554996 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c049787_03d2_4679_8705_ec2cd1ad8141.slice/crio-6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a WatchSource:0}: Error finding container 6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a: Status 404 returned error can't find the container with id 6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.573375 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.887550 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Feb 17 13:45:14 crc kubenswrapper[4804]: W0217 13:45:14.897814 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9eb8e8f_8bd1_4f69_84ee_27213046c709.slice/crio-918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78 WatchSource:0}: Error finding container 918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78: Status 404 returned error can't find the container with id 918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78 Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.946120 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:45:14 crc kubenswrapper[4804]: I0217 13:45:14.953691 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.024060 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 13:45:15 crc kubenswrapper[4804]: W0217 13:45:15.030103 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e1124a_f402_422d_a906_8d22c90d4abe.slice/crio-ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8 WatchSource:0}: Error finding container ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8: Status 404 returned error can't find the container with id ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8 Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063618 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " 
Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063674 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063716 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") pod \"13452752-6880-43b4-9a63-8768d0afa122\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063800 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") pod \"13452752-6880-43b4-9a63-8768d0afa122\" (UID: \"13452752-6880-43b4-9a63-8768d0afa122\") " Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.063836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") pod \"87f6a03c-039e-4107-985b-803f59ccfb89\" (UID: \"87f6a03c-039e-4107-985b-803f59ccfb89\") " Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064358 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config" (OuterVolumeSpecName: "config") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.064374 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config" (OuterVolumeSpecName: "config") pod "13452752-6880-43b4-9a63-8768d0afa122" (UID: "13452752-6880-43b4-9a63-8768d0afa122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.067656 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq" (OuterVolumeSpecName: "kube-api-access-lxjdq") pod "87f6a03c-039e-4107-985b-803f59ccfb89" (UID: "87f6a03c-039e-4107-985b-803f59ccfb89"). InnerVolumeSpecName "kube-api-access-lxjdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.067865 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs" (OuterVolumeSpecName: "kube-api-access-gs5bs") pod "13452752-6880-43b4-9a63-8768d0afa122" (UID: "13452752-6880-43b4-9a63-8768d0afa122"). InnerVolumeSpecName "kube-api-access-gs5bs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165907 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13452752-6880-43b4-9a63-8768d0afa122-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165955 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjdq\" (UniqueName: \"kubernetes.io/projected/87f6a03c-039e-4107-985b-803f59ccfb89-kube-api-access-lxjdq\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165971 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165983 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f6a03c-039e-4107-985b-803f59ccfb89-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.165996 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5bs\" (UniqueName: \"kubernetes.io/projected/13452752-6880-43b4-9a63-8768d0afa122-kube-api-access-gs5bs\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.475474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerDied","Data":"96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.475299 4804 generic.go:334] "Generic (PLEG): container finished" podID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerID="96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568" exitCode=0 Feb 17 13:45:15 crc 
kubenswrapper[4804]: I0217 13:45:15.480688 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.480733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rftlc" event={"ID":"13452752-6880-43b4-9a63-8768d0afa122","Type":"ContainerDied","Data":"0a2ffab0e99d6480ecf94c911214dd4efd07af4dfd4133a32e88b8a9e531736b"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.483661 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerStarted","Data":"6af0a26e9132d4c61e6cb494719994825c6ff8368e85c8ef8c51fa4c2767ffd0"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.488282 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.488293 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c8wp7" event={"ID":"87f6a03c-039e-4107-985b-803f59ccfb89","Type":"ContainerDied","Data":"94f72eaaad772aaa7f7438e818024e90c7cbfcba278a93aab7cafc50db2475ad"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.491245 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.491287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"918ec6cc8efbea653b3df239c3b4bee7a4be058ae44f44454a023eb003e3da78"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.494829 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"ff3764a92973d34694c59227480eef6b669b269368ac766f59b74883f0f1bee8"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.501772 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.504671 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd" event={"ID":"9c049787-03d2-4679-8705-ec2cd1ad8141","Type":"ContainerStarted","Data":"6ea55292565299fc6b077dff53f76fc89ceb245b582b5f477275bb181cee652a"} Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.569298 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.579165 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c8wp7"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.696321 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.795559 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rftlc"] Feb 17 13:45:15 crc kubenswrapper[4804]: I0217 13:45:15.924847 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.095398 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p4wrm"] Feb 17 13:45:16 crc kubenswrapper[4804]: W0217 13:45:16.246731 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45330d20_989c_4507_ae57_5beaee075484.slice/crio-a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53 WatchSource:0}: Error finding container a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53: Status 404 returned error can't find the container with id a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53 Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.312281 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507362 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.507579 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") pod \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\" (UID: \"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f\") " Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.508959 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.515776 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.517378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"a92ca6162bb828356f877ee3020dc0b6e595dcb735158f701f58b2ba7a393d53"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.518664 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"d6d5305f6f8a461927703285f13aa4b342733e9de1167ba86afdd469ef338742"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521738 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521857 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs" event={"ID":"c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f","Type":"ContainerDied","Data":"004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db"} Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.521876 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004599c68202e5bb23f471139692d2f94e213853179d88384fbc6aa468c034db" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.527420 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd" (OuterVolumeSpecName: "kube-api-access-qflrd") pod "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" (UID: "c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f"). InnerVolumeSpecName "kube-api-access-qflrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.593221 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13452752-6880-43b4-9a63-8768d0afa122" path="/var/lib/kubelet/pods/13452752-6880-43b4-9a63-8768d0afa122/volumes" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.593920 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f6a03c-039e-4107-985b-803f59ccfb89" path="/var/lib/kubelet/pods/87f6a03c-039e-4107-985b-803f59ccfb89/volumes" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609423 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qflrd\" (UniqueName: \"kubernetes.io/projected/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-kube-api-access-qflrd\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609469 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:16 crc kubenswrapper[4804]: I0217 13:45:16.609482 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:18 crc kubenswrapper[4804]: I0217 13:45:18.536660 4804 generic.go:334] "Generic (PLEG): container finished" podID="49b02c8f-ff07-48f9-8012-e78dc6591499" containerID="b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99" exitCode=0 Feb 17 13:45:18 crc kubenswrapper[4804]: I0217 13:45:18.536781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerDied","Data":"b93a15f86d51cc28e40802669fc1dc0ee030c02a56e4690a974969a6a5e38c99"} Feb 17 13:45:19 crc kubenswrapper[4804]: I0217 13:45:19.546169 4804 
generic.go:334] "Generic (PLEG): container finished" podID="f9eb8e8f-8bd1-4f69-84ee-27213046c709" containerID="4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542" exitCode=0 Feb 17 13:45:19 crc kubenswrapper[4804]: I0217 13:45:19.546254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerDied","Data":"4dad789ec862994f5efea14d5772a174e7195a22623c58ea7121822318679542"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.603954 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"a5c4a5ce7132a270b2b5975f3fb68551b963040bcebfde546b06f8fa1f907bb6"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.606478 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f5ef96d0-19a6-4561-bde2-cf38e0280b39","Type":"ContainerStarted","Data":"48c3b4c65a16ba5ebe3448b8348b8660299955c67115f87eae6f949edef29da2"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.606644 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.608221 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"a19ba278c8655bc8d0ada49c717a14ce1984144439dfb2b337adf3b24c18dd11"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.612047 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd" event={"ID":"9c049787-03d2-4679-8705-ec2cd1ad8141","Type":"ContainerStarted","Data":"708395c123e89d382895495e94d97e1a95dc8f67a8cb757f1413c13804265e38"} Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.612868 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-rzcfd"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.614514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerStarted","Data":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"}
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.614572 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.617495 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f9eb8e8f-8bd1-4f69-84ee-27213046c709","Type":"ContainerStarted","Data":"39996e72e1656146c1fd21d8c62c9541f69f078226387f63ab48de9f15249bc6"}
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.619380 4804 generic.go:334] "Generic (PLEG): container finished" podID="45330d20-989c-4507-ae57-5beaee075484" containerID="217e77a7b262a2ea58a9d14b86a7ed1f48d810f819a1d8df9ec003eb84b66ae4" exitCode=0
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.619471 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerDied","Data":"217e77a7b262a2ea58a9d14b86a7ed1f48d810f819a1d8df9ec003eb84b66ae4"}
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.622389 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"49b02c8f-ff07-48f9-8012-e78dc6591499","Type":"ContainerStarted","Data":"d0a7f9b158d9b783df1a932e85ef37a9acfdf64570dc9811c07b05b77f46bfbf"}
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.635845 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.526031984 podStartE2EDuration="28.635819816s" podCreationTimestamp="2026-02-17 13:44:53 +0000 UTC" firstStartedPulling="2026-02-17 13:45:14.35089735 +0000 UTC m=+1188.462316697" lastFinishedPulling="2026-02-17 13:45:19.460685192 +0000 UTC m=+1193.572104529" observedRunningTime="2026-02-17 13:45:21.631823951 +0000 UTC m=+1195.743243288" watchObservedRunningTime="2026-02-17 13:45:21.635819816 +0000 UTC m=+1195.747239153"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.657056 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rzcfd" podStartSLOduration=16.943692872 podStartE2EDuration="22.657037191s" podCreationTimestamp="2026-02-17 13:44:59 +0000 UTC" firstStartedPulling="2026-02-17 13:45:14.56786633 +0000 UTC m=+1188.679285667" lastFinishedPulling="2026-02-17 13:45:20.281210649 +0000 UTC m=+1194.392629986" observedRunningTime="2026-02-17 13:45:21.654997887 +0000 UTC m=+1195.766417224" watchObservedRunningTime="2026-02-17 13:45:21.657037191 +0000 UTC m=+1195.768456528"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.682502 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.075520288 podStartE2EDuration="30.682476739s" podCreationTimestamp="2026-02-17 13:44:51 +0000 UTC" firstStartedPulling="2026-02-17 13:45:00.43640937 +0000 UTC m=+1174.547828707" lastFinishedPulling="2026-02-17 13:45:14.043365821 +0000 UTC m=+1188.154785158" observedRunningTime="2026-02-17 13:45:21.677155602 +0000 UTC m=+1195.788574939" watchObservedRunningTime="2026-02-17 13:45:21.682476739 +0000 UTC m=+1195.793896096"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.707120 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.70709537 podStartE2EDuration="29.70709537s" podCreationTimestamp="2026-02-17 13:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:21.701894467 +0000 UTC m=+1195.813313884" watchObservedRunningTime="2026-02-17 13:45:21.70709537 +0000 UTC m=+1195.818514707"
Feb 17 13:45:21 crc kubenswrapper[4804]: I0217 13:45:21.770799 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.992601054 podStartE2EDuration="25.770767865s" podCreationTimestamp="2026-02-17 13:44:56 +0000 UTC" firstStartedPulling="2026-02-17 13:45:14.599471341 +0000 UTC m=+1188.710890678" lastFinishedPulling="2026-02-17 13:45:20.377638152 +0000 UTC m=+1194.489057489" observedRunningTime="2026-02-17 13:45:21.739074582 +0000 UTC m=+1195.850493919" watchObservedRunningTime="2026-02-17 13:45:21.770767865 +0000 UTC m=+1195.882187232"
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.533347 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.534114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.631782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"b657c439bdf2c279c1796f02b03ef98f0ccd6b8f5f26b36d733aa14612d348ec"}
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.633162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"10e1124a-f402-422d-a906-8d22c90d4abe","Type":"ContainerStarted","Data":"903c8a51b1a4cf89113253fd0c5b969fd2f642adf54120a08aabbe7c263b3f27"}
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.639662 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fc5c8da-b323-4afb-aa47-125fc63caefd","Type":"ContainerStarted","Data":"484e55da3fe17a14782d7443a2dbf3691be841c5bbaf448fe334e0feba2b7a89"}
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.658403 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.489472895 podStartE2EDuration="20.658379065s" podCreationTimestamp="2026-02-17 13:45:02 +0000 UTC" firstStartedPulling="2026-02-17 13:45:15.033456273 +0000 UTC m=+1189.144875620" lastFinishedPulling="2026-02-17 13:45:22.202362433 +0000 UTC m=+1196.313781790" observedRunningTime="2026-02-17 13:45:22.64960622 +0000 UTC m=+1196.761025557" watchObservedRunningTime="2026-02-17 13:45:22.658379065 +0000 UTC m=+1196.769798402"
Feb 17 13:45:22 crc kubenswrapper[4804]: I0217 13:45:22.680654 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.481921791 podStartE2EDuration="24.680634373s" podCreationTimestamp="2026-02-17 13:44:58 +0000 UTC" firstStartedPulling="2026-02-17 13:45:15.987878587 +0000 UTC m=+1190.099297924" lastFinishedPulling="2026-02-17 13:45:22.186591149 +0000 UTC m=+1196.298010506" observedRunningTime="2026-02-17 13:45:22.672124146 +0000 UTC m=+1196.783543493" watchObservedRunningTime="2026-02-17 13:45:22.680634373 +0000 UTC m=+1196.792053710"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.650481 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"}
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p4wrm" event={"ID":"45330d20-989c-4507-ae57-5beaee075484","Type":"ContainerStarted","Data":"5ede0db05c38355b5ea63ac0452cc946d69281f7b2bd7401f4770c9f3a1bf045"}
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653528 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p4wrm"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.653551 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p4wrm"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.706404 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p4wrm" podStartSLOduration=20.674907496 podStartE2EDuration="24.706382692s" podCreationTimestamp="2026-02-17 13:44:59 +0000 UTC" firstStartedPulling="2026-02-17 13:45:16.250498248 +0000 UTC m=+1190.361917585" lastFinishedPulling="2026-02-17 13:45:20.281973444 +0000 UTC m=+1194.393392781" observedRunningTime="2026-02-17 13:45:23.699493496 +0000 UTC m=+1197.810912843" watchObservedRunningTime="2026-02-17 13:45:23.706382692 +0000 UTC m=+1197.817802039"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.897661 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.944595 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 17 13:45:23 crc kubenswrapper[4804]: I0217 13:45:23.953651 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.038422 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.039265 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.664449 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 17 13:45:24 crc kubenswrapper[4804]: I0217 13:45:24.954364 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.004954 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.714012 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.715907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.835161 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 13:45:25 crc kubenswrapper[4804]: I0217 13:45:25.835551 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.023406 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.102233 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"]
Feb 17 13:45:26 crc kubenswrapper[4804]: E0217 13:45:26.102530 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.102544 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.102700 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" containerName="collect-profiles"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.103215 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.106767 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.113149 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.114626 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.119653 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.130919 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.139131 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193886 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193951 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193978 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.193999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194039 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194082 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.194233 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.220622 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.248306 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.254779 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.264042 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.264281 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.271058 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c6spm"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.271749 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.292127 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295102 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295141 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295165 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295182 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295225 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.295271 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.296980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.297983 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.298030 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.298115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.314722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovn-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.316838 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d286aa08-b0df-44e8-9128-f596f4b44db8-ovs-rundir\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.340357 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.341269 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350087 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350159 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-combined-ca-bundle\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350543 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d286aa08-b0df-44e8-9128-f596f4b44db8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.350881 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d286aa08-b0df-44e8-9128-f596f4b44db8-config\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.351782 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.357868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfp4\" (UniqueName: \"kubernetes.io/projected/d286aa08-b0df-44e8-9128-f596f4b44db8-kube-api-access-ddfp4\") pod \"ovn-controller-metrics-4s7l5\" (UID: \"d286aa08-b0df-44e8-9128-f596f4b44db8\") " pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.366606 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.383030 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"dnsmasq-dns-6bc7876d45-c69gn\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.403833 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.403920 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404031 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404141 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404160 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.404316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.430269 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"]
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.460279 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4s7l5"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.478068 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506676 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506757 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506790 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506874 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506903 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506944 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.506977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507106 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.507186 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.511808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.512815 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-scripts\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.513604 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e322ccb-33cf-466f-91fb-63781bdcffb6-config\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.514525 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.526380 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.530506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.542136 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e322ccb-33cf-466f-91fb-63781bdcffb6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.561031 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72mb\" (UniqueName: \"kubernetes.io/projected/3e322ccb-33cf-466f-91fb-63781bdcffb6-kube-api-access-f72mb\") pod \"ovn-northd-0\" (UID: \"3e322ccb-33cf-466f-91fb-63781bdcffb6\") " pod="openstack/ovn-northd-0"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609013 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609083 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609141 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609176 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.609218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.612882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613271 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.613399 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.665386 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c6spm"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.674120 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.674669 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-s5nsf\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " pod="openstack/dnsmasq-dns-8554648995-s5nsf"
Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.679707 4804 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.784555 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.872149 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917473 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917541 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.917614 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") pod \"3586301a-dce2-427b-b5c4-9376e59fbf27\" (UID: \"3586301a-dce2-427b-b5c4-9376e59fbf27\") " Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.919063 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config" (OuterVolumeSpecName: "config") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.920440 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.938532 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l" (OuterVolumeSpecName: "kube-api-access-9zx6l") pod "3586301a-dce2-427b-b5c4-9376e59fbf27" (UID: "3586301a-dce2-427b-b5c4-9376e59fbf27"). InnerVolumeSpecName "kube-api-access-9zx6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:26 crc kubenswrapper[4804]: I0217 13:45:26.987565 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020051 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020091 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zx6l\" (UniqueName: \"kubernetes.io/projected/3586301a-dce2-427b-b5c4-9376e59fbf27-kube-api-access-9zx6l\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.020105 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3586301a-dce2-427b-b5c4-9376e59fbf27-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.121740 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.122352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.122545 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") pod \"8175f453-b68b-4236-844d-ff723515fe63\" (UID: \"8175f453-b68b-4236-844d-ff723515fe63\") " Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.123869 4804 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config" (OuterVolumeSpecName: "config") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.124571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.127423 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl" (OuterVolumeSpecName: "kube-api-access-kv9pl") pod "8175f453-b68b-4236-844d-ff723515fe63" (UID: "8175f453-b68b-4236-844d-ff723515fe63"). InnerVolumeSpecName "kube-api-access-kv9pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229226 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229280 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv9pl\" (UniqueName: \"kubernetes.io/projected/8175f453-b68b-4236-844d-ff723515fe63-kube-api-access-kv9pl\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.229295 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8175f453-b68b-4236-844d-ff723515fe63-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.233336 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4s7l5"] Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.241182 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.391305 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 13:45:27 crc kubenswrapper[4804]: W0217 13:45:27.408532 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e322ccb_33cf_466f_91fb_63781bdcffb6.slice/crio-26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad WatchSource:0}: Error finding container 26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad: Status 404 returned error can't find the container with id 26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.445012 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:27 crc kubenswrapper[4804]: W0217 13:45:27.448061 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e35955_0967_4a9c_b4e5_68316c98d58f.slice/crio-2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921 WatchSource:0}: Error finding container 2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921: Status 404 returned error can't find the container with id 2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921 Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.730022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerStarted","Data":"2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4s7l5" event={"ID":"d286aa08-b0df-44e8-9128-f596f4b44db8","Type":"ContainerStarted","Data":"5a253bb1e30f5407483358daadb8d300069de0d9a15599231de8d80c785f568b"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734925 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4s7l5" event={"ID":"d286aa08-b0df-44e8-9128-f596f4b44db8","Type":"ContainerStarted","Data":"421745ebbe6511f48fac6a70851b5f11760a42f7a8de727170ebbb740043a13e"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734941 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" event={"ID":"3586301a-dce2-427b-b5c4-9376e59fbf27","Type":"ContainerDied","Data":"b6acb0860f5dd58b1333ac392aa371b170675172cb3eb7dbaaabc60cbdae0d1e"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.734962 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" event={"ID":"8175f453-b68b-4236-844d-ff723515fe63","Type":"ContainerDied","Data":"6dec93dab248c776ff8091a9233f8da9e53443d47dfd060ebb89371b1dc81611"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.736096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerStarted","Data":"beea4edb9ab8a4290234d33096c41b7ec1bcf83e4fedb843a0fee43bc42ec3a0"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.739484 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vrzhp" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.741268 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"26c9b6efe4276a413a683c57c2631ff7e090f2ebdf021b74ccfc04cc96b357ad"} Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.741370 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-kqvs6" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.768127 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4s7l5" podStartSLOduration=1.768098736 podStartE2EDuration="1.768098736s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:27.759282639 +0000 UTC m=+1201.870701976" watchObservedRunningTime="2026-02-17 13:45:27.768098736 +0000 UTC m=+1201.879518073" Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.880883 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.886777 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-kqvs6"] Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.917532 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:45:27 crc kubenswrapper[4804]: I0217 13:45:27.927452 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vrzhp"] Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.195460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.276437 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.585643 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3586301a-dce2-427b-b5c4-9376e59fbf27" path="/var/lib/kubelet/pods/3586301a-dce2-427b-b5c4-9376e59fbf27/volumes" Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.586769 4804 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8175f453-b68b-4236-844d-ff723515fe63" path="/var/lib/kubelet/pods/8175f453-b68b-4236-844d-ff723515fe63/volumes" Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.751051 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"2ea9196e59af7aa250cd15085ebe409c06fd1598d66d60e3f77b06e040e8e13b"} Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.753139 4804 generic.go:334] "Generic (PLEG): container finished" podID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" exitCode=0 Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.753271 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e"} Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.754822 4804 generic.go:334] "Generic (PLEG): container finished" podID="50142f30-df04-4aa7-85e1-e303286966b7" containerID="ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85" exitCode=0 Feb 17 13:45:28 crc kubenswrapper[4804]: I0217 13:45:28.754932 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.082073 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.774807 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"3e322ccb-33cf-466f-91fb-63781bdcffb6","Type":"ContainerStarted","Data":"59c5735b217f50dec368f226d655e3f5011a449fe03afeb3ec36351d968b71bf"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.775172 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.779085 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerStarted","Data":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.779757 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.785830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerStarted","Data":"e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0"} Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.786763 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.804779 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.760325873 podStartE2EDuration="3.804757638s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.411962273 +0000 UTC m=+1201.523381610" lastFinishedPulling="2026-02-17 13:45:28.456394048 +0000 UTC m=+1202.567813375" observedRunningTime="2026-02-17 13:45:29.801620921 +0000 UTC m=+1203.913040268" watchObservedRunningTime="2026-02-17 13:45:29.804757638 +0000 UTC m=+1203.916176975" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.819990 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podStartSLOduration=3.384035783 podStartE2EDuration="3.819966746s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.249940955 +0000 UTC m=+1201.361360292" lastFinishedPulling="2026-02-17 13:45:27.685871918 +0000 UTC m=+1201.797291255" observedRunningTime="2026-02-17 13:45:29.817356344 +0000 UTC m=+1203.928775711" watchObservedRunningTime="2026-02-17 13:45:29.819966746 +0000 UTC m=+1203.931386083" Feb 17 13:45:29 crc kubenswrapper[4804]: I0217 13:45:29.841514 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-s5nsf" podStartSLOduration=3.418349279 podStartE2EDuration="3.84148699s" podCreationTimestamp="2026-02-17 13:45:26 +0000 UTC" firstStartedPulling="2026-02-17 13:45:27.452608748 +0000 UTC m=+1201.564028085" lastFinishedPulling="2026-02-17 13:45:27.875746459 +0000 UTC m=+1201.987165796" observedRunningTime="2026-02-17 13:45:29.834083628 +0000 UTC m=+1203.945502985" watchObservedRunningTime="2026-02-17 13:45:29.84148699 +0000 UTC m=+1203.952906327" Feb 17 13:45:30 crc kubenswrapper[4804]: I0217 13:45:30.705532 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 13:45:30 crc kubenswrapper[4804]: I0217 13:45:30.802895 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.284258 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.285755 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.290759 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.294925 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.313107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.313165 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.414283 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.414349 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: 
\"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.415142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.435739 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"root-account-create-update-hbdkd\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:31 crc kubenswrapper[4804]: I0217 13:45:31.615402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:32 crc kubenswrapper[4804]: I0217 13:45:32.117513 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:32 crc kubenswrapper[4804]: W0217 13:45:32.125461 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded593eeb_17c6_42cc_a392_9fbb1f3aef6e.slice/crio-0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d WatchSource:0}: Error finding container 0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d: Status 404 returned error can't find the container with id 0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d Feb 17 13:45:32 crc kubenswrapper[4804]: I0217 13:45:32.810447 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerStarted","Data":"0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d"} Feb 17 13:45:34 crc kubenswrapper[4804]: I0217 13:45:34.833892 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerStarted","Data":"b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4"} Feb 17 13:45:34 crc kubenswrapper[4804]: I0217 13:45:34.857410 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-hbdkd" podStartSLOduration=3.85738542 podStartE2EDuration="3.85738542s" podCreationTimestamp="2026-02-17 13:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:34.850474303 +0000 UTC m=+1208.961893660" watchObservedRunningTime="2026-02-17 13:45:34.85738542 +0000 UTC m=+1208.968804777" Feb 17 
13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.135530 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.136627 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.165898 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.190295 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.190436 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.232612 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.233737 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.235692 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.242152 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292397 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292459 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.292855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: 
\"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.293245 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.311886 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"keystone-db-create-dl5b9\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.384583 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.385983 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.391774 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.394893 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.394990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.396410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.419736 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"keystone-886b-account-create-update-h84mx\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.468851 4804 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.496294 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.496472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.539057 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.540457 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.542513 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.547378 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.548965 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.597709 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598100 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598307 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.598663 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.620972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"placement-db-create-6m6pk\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.699514 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.699691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.700728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.707661 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.722171 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"placement-0898-account-create-update-6vpd7\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.844963 4804 generic.go:334] "Generic (PLEG): container finished" podID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerID="b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4" exitCode=0 Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.845027 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerDied","Data":"b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4"} Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.935886 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:35 crc kubenswrapper[4804]: I0217 13:45:35.966934 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:45:35 crc kubenswrapper[4804]: W0217 13:45:35.976632 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bc37bd5_6784_41f8_98de_ef6a43493cd6.slice/crio-f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0 WatchSource:0}: Error finding container f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0: Status 404 returned error can't find the container with id f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.064747 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.077057 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9dbe9b_ced6_453d_9f59_0d92e2a69043.slice/crio-f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3 WatchSource:0}: Error finding container f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3: Status 404 returned error can't find the container with id f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.138559 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.156770 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2edd89a7_0866_4677_8b25_9654130c6ac5.slice/crio-e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c WatchSource:0}: Error 
finding container e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c: Status 404 returned error can't find the container with id e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.421026 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:45:36 crc kubenswrapper[4804]: W0217 13:45:36.428713 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7e6539_c0c9_40e7_b076_38cc23f233cc.slice/crio-a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1 WatchSource:0}: Error finding container a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1: Status 404 returned error can't find the container with id a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.432891 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.433513 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" containerID="cri-o://e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0" gracePeriod=10 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.435410 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.487095 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused" Feb 17 13:45:36 crc kubenswrapper[4804]: 
I0217 13:45:36.491437 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.493470 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.568412 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625672 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625800 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625859 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: 
\"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.625980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.727936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.728386 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.728808 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.729712 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.732022 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.732769 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.733241 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.752008 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"dnsmasq-dns-b8fbc5445-mp4l9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.858458 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerStarted","Data":"f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.858514 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerStarted","Data":"a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861298 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerID="df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197" exitCode=0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerDied","Data":"df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.861444 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerStarted","Data":"f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862932 4804 generic.go:334] "Generic (PLEG): 
container finished" podID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerID="e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1" exitCode=0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862982 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerDied","Data":"e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.862997 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerStarted","Data":"f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.870714 4804 generic.go:334] "Generic (PLEG): container finished" podID="50142f30-df04-4aa7-85e1-e303286966b7" containerID="e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0" exitCode=0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.870776 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872900 4804 generic.go:334] "Generic (PLEG): container finished" podID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerID="9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836" exitCode=0 Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872958 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerDied","Data":"9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.872978 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerStarted","Data":"e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c"} Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.874335 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:36 crc kubenswrapper[4804]: I0217 13:45:36.895130 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0898-account-create-update-6vpd7" podStartSLOduration=1.895098746 podStartE2EDuration="1.895098746s" podCreationTimestamp="2026-02-17 13:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:36.875917615 +0000 UTC m=+1210.987336972" watchObservedRunningTime="2026-02-17 13:45:36.895098746 +0000 UTC m=+1211.006518103" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.014708 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.215254 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.245785 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") pod \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.245930 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") pod \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\" (UID: \"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.246564 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" (UID: "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.253316 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k" (OuterVolumeSpecName: "kube-api-access-5fb7k") pod "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" (UID: "ed593eeb-17c6-42cc-a392-9fbb1f3aef6e"). InnerVolumeSpecName "kube-api-access-5fb7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.347506 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fb7k\" (UniqueName: \"kubernetes.io/projected/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-kube-api-access-5fb7k\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.347557 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.487129 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551005 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551108 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551209 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.551348 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") pod \"50142f30-df04-4aa7-85e1-e303286966b7\" (UID: \"50142f30-df04-4aa7-85e1-e303286966b7\") " Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.558389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv" (OuterVolumeSpecName: "kube-api-access-csrsv") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "kube-api-access-csrsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.595481 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config" (OuterVolumeSpecName: "config") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.596213 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.598521 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50142f30-df04-4aa7-85e1-e303286966b7" (UID: "50142f30-df04-4aa7-85e1-e303286966b7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.598658 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:45:37 crc kubenswrapper[4804]: W0217 13:45:37.603063 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86aca321_b4a3_4d89_ab34_5d311aa11fe9.slice/crio-cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f WatchSource:0}: Error finding container cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f: Status 404 returned error can't find the container with id cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.612738 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613170 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="init" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613195 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="init" Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613231 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613239 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.613266 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613274 4804 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="50142f30-df04-4aa7-85e1-e303286966b7" containerName="dnsmasq-dns" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.613521 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" containerName="mariadb-account-create-update" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.619932 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622622 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622645 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dl8mk" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622689 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.622700 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.637029 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663854 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663911 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.663975 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664025 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664091 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664122 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664238 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrsv\" (UniqueName: \"kubernetes.io/projected/50142f30-df04-4aa7-85e1-e303286966b7-kube-api-access-csrsv\") 
on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664253 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664264 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.664275 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50142f30-df04-4aa7-85e1-e303286966b7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765284 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765764 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cache\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-cache\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765917 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-lock\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.765952 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766075 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766153 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766277 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766305 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 
13:45:37 crc kubenswrapper[4804]: E0217 13:45:37.766348 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:38.266332822 +0000 UTC m=+1212.377752159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.766978 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.772730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.783823 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfhb\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-kube-api-access-vcfhb\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.793831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: 
\"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" event={"ID":"50142f30-df04-4aa7-85e1-e303286966b7","Type":"ContainerDied","Data":"beea4edb9ab8a4290234d33096c41b7ec1bcf83e4fedb843a0fee43bc42ec3a0"} Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890529 4804 scope.go:117] "RemoveContainer" containerID="e01cd3f75d3aef51eac8d30928a4014bf5777a2284faed2dc4b55788bbefaed0" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.890650 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-c69gn" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894659 4804 generic.go:334] "Generic (PLEG): container finished" podID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0" exitCode=0 Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"} Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.894793 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerStarted","Data":"cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f"} Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896093 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hbdkd" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hbdkd" event={"ID":"ed593eeb-17c6-42cc-a392-9fbb1f3aef6e","Type":"ContainerDied","Data":"0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d"} Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.896174 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e526352fe86de53d110e5db802aba66020e9334cff6f240f845e8f5d42b814d" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.897787 4804 generic.go:334] "Generic (PLEG): container finished" podID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerID="f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf" exitCode=0 Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.897836 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerDied","Data":"f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"} Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.946092 4804 scope.go:117] "RemoveContainer" containerID="ab28572c13b5040bf4ce2b36ef6ec484d61c5f798e505ab5c93281f67d3def85" Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.978731 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:37 crc kubenswrapper[4804]: I0217 13:45:37.985359 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-c69gn"] Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.227281 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286105 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") pod \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286270 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") pod \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\" (UID: \"6f9dbe9b-ced6-453d-9f59-0d92e2a69043\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.286522 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286832 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286859 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:38 crc kubenswrapper[4804]: E0217 13:45:38.286911 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:39.286891688 +0000 UTC m=+1213.398311025 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.289937 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f9dbe9b-ced6-453d-9f59-0d92e2a69043" (UID: "6f9dbe9b-ced6-453d-9f59-0d92e2a69043"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.299825 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv" (OuterVolumeSpecName: "kube-api-access-5mdgv") pod "6f9dbe9b-ced6-453d-9f59-0d92e2a69043" (UID: "6f9dbe9b-ced6-453d-9f59-0d92e2a69043"). InnerVolumeSpecName "kube-api-access-5mdgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.388258 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.388300 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdgv\" (UniqueName: \"kubernetes.io/projected/6f9dbe9b-ced6-453d-9f59-0d92e2a69043-kube-api-access-5mdgv\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.406460 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.409804 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.489940 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") pod \"2edd89a7-0866-4677-8b25-9654130c6ac5\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490089 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") pod \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490161 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") pod \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\" (UID: \"4bc37bd5-6784-41f8-98de-ef6a43493cd6\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490221 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") pod \"2edd89a7-0866-4677-8b25-9654130c6ac5\" (UID: \"2edd89a7-0866-4677-8b25-9654130c6ac5\") " Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"4bc37bd5-6784-41f8-98de-ef6a43493cd6" (UID: "4bc37bd5-6784-41f8-98de-ef6a43493cd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.490964 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2edd89a7-0866-4677-8b25-9654130c6ac5" (UID: "2edd89a7-0866-4677-8b25-9654130c6ac5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.494327 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx" (OuterVolumeSpecName: "kube-api-access-nqldx") pod "2edd89a7-0866-4677-8b25-9654130c6ac5" (UID: "2edd89a7-0866-4677-8b25-9654130c6ac5"). InnerVolumeSpecName "kube-api-access-nqldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.494494 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw" (OuterVolumeSpecName: "kube-api-access-8qsxw") pod "4bc37bd5-6784-41f8-98de-ef6a43493cd6" (UID: "4bc37bd5-6784-41f8-98de-ef6a43493cd6"). InnerVolumeSpecName "kube-api-access-8qsxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.584155 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50142f30-df04-4aa7-85e1-e303286966b7" path="/var/lib/kubelet/pods/50142f30-df04-4aa7-85e1-e303286966b7/volumes" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604618 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqldx\" (UniqueName: \"kubernetes.io/projected/2edd89a7-0866-4677-8b25-9654130c6ac5-kube-api-access-nqldx\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604656 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bc37bd5-6784-41f8-98de-ef6a43493cd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604670 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qsxw\" (UniqueName: \"kubernetes.io/projected/4bc37bd5-6784-41f8-98de-ef6a43493cd6-kube-api-access-8qsxw\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.604683 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2edd89a7-0866-4677-8b25-9654130c6ac5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.907377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerStarted","Data":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.907486 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911746 4804 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6m6pk" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6m6pk" event={"ID":"2edd89a7-0866-4677-8b25-9654130c6ac5","Type":"ContainerDied","Data":"e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.911891 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e323ede546dfc16e7b70b5e9c8b973e450ff37d7035fb2de2ab1e2a23046c82c" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.914984 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-886b-account-create-update-h84mx" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.915050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-886b-account-create-update-h84mx" event={"ID":"6f9dbe9b-ced6-453d-9f59-0d92e2a69043","Type":"ContainerDied","Data":"f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.915078 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bcec3b699cdf0f9f0db04a7fa76d46cf5931f896cb8a46357142efa96651b3" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.917872 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dl5b9" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.918506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dl5b9" event={"ID":"4bc37bd5-6784-41f8-98de-ef6a43493cd6","Type":"ContainerDied","Data":"f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0"} Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.918530 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20e76ded39c36574ceb45bd27901b8c625cfc377c6fbf0205827cf374bd32b0" Feb 17 13:45:38 crc kubenswrapper[4804]: I0217 13:45:38.936928 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" podStartSLOduration=2.93690596 podStartE2EDuration="2.93690596s" podCreationTimestamp="2026-02-17 13:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:38.927499066 +0000 UTC m=+1213.038918413" watchObservedRunningTime="2026-02-17 13:45:38.93690596 +0000 UTC m=+1213.048325297" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.227772 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228554 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228574 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228607 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 
13:45:39.228615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.228647 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228655 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228855 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228879 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.228893 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" containerName="mariadb-database-create" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.229573 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.244176 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.256319 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322417 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.322717 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322728 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.322895 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" containerName="mariadb-account-create-update" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.323845 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329154 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329826 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") pod \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.329911 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") pod \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\" (UID: \"ba7e6539-c0c9-40e7-b076-38cc23f233cc\") " Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330283 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.330315 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331015 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331038 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: E0217 13:45:39.331084 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:41.331064585 +0000 UTC m=+1215.442483982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.331550 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7e6539-c0c9-40e7-b076-38cc23f233cc" (UID: "ba7e6539-c0c9-40e7-b076-38cc23f233cc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.336390 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk" (OuterVolumeSpecName: "kube-api-access-6rvmk") pod "ba7e6539-c0c9-40e7-b076-38cc23f233cc" (UID: "ba7e6539-c0c9-40e7-b076-38cc23f233cc"). InnerVolumeSpecName "kube-api-access-6rvmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.340848 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432313 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432372 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432439 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432510 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7e6539-c0c9-40e7-b076-38cc23f233cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.432524 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rvmk\" (UniqueName: \"kubernetes.io/projected/ba7e6539-c0c9-40e7-b076-38cc23f233cc-kube-api-access-6rvmk\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.433375 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.451582 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"glance-db-create-v8tb5\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.534862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" 
(UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.534936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.535932 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.553975 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"glance-f8a9-account-create-update-98wtk\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.568223 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.684737 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934174 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0898-account-create-update-6vpd7" event={"ID":"ba7e6539-c0c9-40e7-b076-38cc23f233cc","Type":"ContainerDied","Data":"a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1"} Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934619 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83919020b9aabb5aba4df6b823708e2580f5074e6ede1893518c8eddc8797b1" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.934223 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0898-account-create-update-6vpd7" Feb 17 13:45:39 crc kubenswrapper[4804]: I0217 13:45:39.999808 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.157874 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:45:40 crc kubenswrapper[4804]: W0217 13:45:40.160533 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0597f43_df0a_427f_b045_e6859849a0d6.slice/crio-25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f WatchSource:0}: Error finding container 25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f: Status 404 returned error can't find the container with id 25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.944401 4804 generic.go:334] "Generic (PLEG): container finished" podID="b0597f43-df0a-427f-b045-e6859849a0d6" containerID="523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f" exitCode=0 Feb 17 13:45:40 crc kubenswrapper[4804]: 
I0217 13:45:40.944625 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerDied","Data":"523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.944789 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerStarted","Data":"25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946751 4804 generic.go:334] "Generic (PLEG): container finished" podID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerID="5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1" exitCode=0 Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946786 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerDied","Data":"5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1"} Feb 17 13:45:40 crc kubenswrapper[4804]: I0217 13:45:40.946805 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerStarted","Data":"edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce"} Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.414943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415256 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" 
not found Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415291 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:41 crc kubenswrapper[4804]: E0217 13:45:41.415366 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:45.41534205 +0000 UTC m=+1219.526761387 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.471461 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.472474 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.474297 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.474710 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.475893 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.486216 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619890 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619955 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.619977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: 
I0217 13:45:41.620008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620193 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620351 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.620486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.721907 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.721985 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722136 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722180 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722206 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.722902 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.723028 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.723278 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.728954 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.730473 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod 
\"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.730696 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.745732 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"swift-ring-rebalance-mv8w5\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:41 crc kubenswrapper[4804]: I0217 13:45:41.798791 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.213005 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mv8w5"] Feb 17 13:45:42 crc kubenswrapper[4804]: W0217 13:45:42.221765 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41aa78f0_ef58_4a36_b1f9_ce222fd8e1e2.slice/crio-a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8 WatchSource:0}: Error finding container a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8: Status 404 returned error can't find the container with id a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8 Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.255706 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.317203 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433015 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") pod \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") pod \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\" (UID: \"4c8ee09a-97bd-4497-81cd-2f0f4952d996\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433230 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") pod \"b0597f43-df0a-427f-b045-e6859849a0d6\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.433295 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") pod \"b0597f43-df0a-427f-b045-e6859849a0d6\" (UID: \"b0597f43-df0a-427f-b045-e6859849a0d6\") " Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.434109 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b0597f43-df0a-427f-b045-e6859849a0d6" (UID: "b0597f43-df0a-427f-b045-e6859849a0d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.434438 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c8ee09a-97bd-4497-81cd-2f0f4952d996" (UID: "4c8ee09a-97bd-4497-81cd-2f0f4952d996"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.438350 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj" (OuterVolumeSpecName: "kube-api-access-gj5wj") pod "b0597f43-df0a-427f-b045-e6859849a0d6" (UID: "b0597f43-df0a-427f-b045-e6859849a0d6"). InnerVolumeSpecName "kube-api-access-gj5wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.438412 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q" (OuterVolumeSpecName: "kube-api-access-wrz7q") pod "4c8ee09a-97bd-4497-81cd-2f0f4952d996" (UID: "4c8ee09a-97bd-4497-81cd-2f0f4952d996"). InnerVolumeSpecName "kube-api-access-wrz7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535557 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8ee09a-97bd-4497-81cd-2f0f4952d996-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535590 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrz7q\" (UniqueName: \"kubernetes.io/projected/4c8ee09a-97bd-4497-81cd-2f0f4952d996-kube-api-access-wrz7q\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535601 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0597f43-df0a-427f-b045-e6859849a0d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.535610 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj5wj\" (UniqueName: \"kubernetes.io/projected/b0597f43-df0a-427f-b045-e6859849a0d6-kube-api-access-gj5wj\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.623224 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.629567 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hbdkd"] Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.961156 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerStarted","Data":"a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963042 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f8a9-account-create-update-98wtk" 
event={"ID":"b0597f43-df0a-427f-b045-e6859849a0d6","Type":"ContainerDied","Data":"25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963059 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f8a9-account-create-update-98wtk" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.963068 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25125cc9707c8d63ca516ea3904a1fab397302333018e923c2c08f764dea015f" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966391 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v8tb5" event={"ID":"4c8ee09a-97bd-4497-81cd-2f0f4952d996","Type":"ContainerDied","Data":"edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce"} Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966523 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v8tb5" Feb 17 13:45:42 crc kubenswrapper[4804]: I0217 13:45:42.966521 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edb5c73dae0b867ad78d1e3312eb0f89bf13d520b78c11a76297e212dcf745ce" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.504557 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:44 crc kubenswrapper[4804]: E0217 13:45:44.506690 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506714 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: E0217 13:45:44.506738 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506744 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506956 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" containerName="mariadb-account-create-update" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.506981 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" containerName="mariadb-database-create" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.507645 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.513855 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.514576 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r5s28" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.520183 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.585821 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed593eeb-17c6-42cc-a392-9fbb1f3aef6e" path="/var/lib/kubelet/pods/ed593eeb-17c6-42cc-a392-9fbb1f3aef6e/volumes" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671195 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " 
pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671303 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.671631 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774527 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774627 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 
crc kubenswrapper[4804]: I0217 13:45:44.774708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.774871 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.782519 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.783505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.784177 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.797335 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"glance-db-sync-lpd9f\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:44 crc kubenswrapper[4804]: I0217 13:45:44.828366 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.487440 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.487669 4804 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.488166 4804 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: E0217 13:45:45.488250 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift podName:90da6e89-6033-4e42-a5ca-bed1a5ad6a46 nodeName:}" failed. No retries permitted until 2026-02-17 13:45:53.488227173 +0000 UTC m=+1227.599646510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift") pod "swift-storage-0" (UID: "90da6e89-6033-4e42-a5ca-bed1a5ad6a46") : configmap "swift-ring-files" not found Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.715552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lpd9f"] Feb 17 13:45:45 crc kubenswrapper[4804]: W0217 13:45:45.716525 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb6c8ec_f280_4566_bb37_b286119956b5.slice/crio-8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18 WatchSource:0}: Error finding container 8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18: Status 404 returned error can't find the container with id 8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18 Feb 17 13:45:45 crc kubenswrapper[4804]: I0217 13:45:45.998886 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerStarted","Data":"8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18"} Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.000629 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerStarted","Data":"acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a"} Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.020439 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mv8w5" podStartSLOduration=2.018387553 podStartE2EDuration="5.020421643s" podCreationTimestamp="2026-02-17 13:45:41 +0000 UTC" firstStartedPulling="2026-02-17 13:45:42.225857764 +0000 UTC m=+1216.337277101" lastFinishedPulling="2026-02-17 13:45:45.227891824 
+0000 UTC m=+1219.339311191" observedRunningTime="2026-02-17 13:45:46.015391326 +0000 UTC m=+1220.126810663" watchObservedRunningTime="2026-02-17 13:45:46.020421643 +0000 UTC m=+1220.131840980" Feb 17 13:45:46 crc kubenswrapper[4804]: I0217 13:45:46.760632 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.016364 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.110691 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.111057 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-s5nsf" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" containerID="cri-o://4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" gracePeriod=10 Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.548458 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627601 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627662 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.627704 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.668765 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:47 crc kubenswrapper[4804]: E0217 13:45:47.671418 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.671461 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" 
containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: E0217 13:45:47.671602 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="init" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.671611 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="init" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.672229 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerName="dnsmasq-dns" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.672989 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.681908 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.690200 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.709095 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.718726 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.720009 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.723018 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config" (OuterVolumeSpecName: "config") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.728733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") pod \"f2e35955-0967-4a9c-b4e5-68316c98d58f\" (UID: \"f2e35955-0967-4a9c-b4e5-68316c98d58f\") " Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731003 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731041 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731052 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.731060 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2e35955-0967-4a9c-b4e5-68316c98d58f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.733142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx" (OuterVolumeSpecName: "kube-api-access-n7ndx") pod "f2e35955-0967-4a9c-b4e5-68316c98d58f" (UID: "f2e35955-0967-4a9c-b4e5-68316c98d58f"). InnerVolumeSpecName "kube-api-access-n7ndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832711 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832887 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.832990 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7ndx\" (UniqueName: \"kubernetes.io/projected/f2e35955-0967-4a9c-b4e5-68316c98d58f-kube-api-access-n7ndx\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:47 crc 
kubenswrapper[4804]: I0217 13:45:47.934625 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.934783 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.935447 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:47 crc kubenswrapper[4804]: I0217 13:45:47.953361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"root-account-create-update-c8wmz\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.021260 4804 generic.go:334] "Generic (PLEG): container finished" podID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerID="de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162" exitCode=0 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.021330 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025556 4804 generic.go:334] "Generic (PLEG): container finished" podID="f2e35955-0967-4a9c-b4e5-68316c98d58f" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" exitCode=0 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025690 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025732 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-s5nsf" event={"ID":"f2e35955-0967-4a9c-b4e5-68316c98d58f","Type":"ContainerDied","Data":"2fd594aa91077237ced828ad76ea96d9e73ca61204c2b63332a894fe7e26b921"} Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025773 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-s5nsf" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.025790 4804 scope.go:117] "RemoveContainer" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.069476 4804 scope.go:117] "RemoveContainer" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.070659 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.075731 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.084615 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-s5nsf"] Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095369 4804 scope.go:117] "RemoveContainer" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: E0217 13:45:48.095756 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": container with ID starting with 4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2 not found: ID does not exist" containerID="4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095800 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2"} err="failed to get container status \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": rpc error: code = NotFound desc = could not find container \"4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2\": container with ID starting with 4e8269b54fecdae67dd01feae8090bcedbeb3d750c8ec050f707a3d55a90d7c2 not found: ID does not exist" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.095834 4804 scope.go:117] "RemoveContainer" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: E0217 13:45:48.096312 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": container with ID starting with 84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e not found: ID does not exist" containerID="84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.096350 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e"} err="failed to get container status \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": rpc error: code = NotFound desc = could not find container \"84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e\": container with ID starting with 84e8ddfc4492066363704b53f96c7ec893c142c41860e555ddf4bd60f09fd51e not found: ID does not exist" Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.510349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:45:48 crc kubenswrapper[4804]: W0217 13:45:48.517802 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c3b824f_ae3d_4681_8b14_16099a2643d5.slice/crio-756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28 WatchSource:0}: Error finding container 756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28: Status 404 returned error can't find the container with id 756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28 Feb 17 13:45:48 crc kubenswrapper[4804]: I0217 13:45:48.583402 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e35955-0967-4a9c-b4e5-68316c98d58f" path="/var/lib/kubelet/pods/f2e35955-0967-4a9c-b4e5-68316c98d58f/volumes" Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033796 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerID="a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21" exitCode=0 Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033862 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerDied","Data":"a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.033900 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerStarted","Data":"756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.035265 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerStarted","Data":"e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb"} Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.035931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:45:49 crc kubenswrapper[4804]: I0217 13:45:49.077438 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.774246489 podStartE2EDuration="59.077410486s" podCreationTimestamp="2026-02-17 13:44:50 +0000 UTC" firstStartedPulling="2026-02-17 13:44:52.619459199 +0000 UTC m=+1166.730878536" lastFinishedPulling="2026-02-17 13:45:13.922623196 +0000 UTC m=+1188.034042533" observedRunningTime="2026-02-17 13:45:49.073156293 +0000 UTC m=+1223.184575650" watchObservedRunningTime="2026-02-17 13:45:49.077410486 +0000 UTC m=+1223.188829833" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.438090 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.587755 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") pod \"6c3b824f-ae3d-4681-8b14-16099a2643d5\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.588492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") pod \"6c3b824f-ae3d-4681-8b14-16099a2643d5\" (UID: \"6c3b824f-ae3d-4681-8b14-16099a2643d5\") " Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.588640 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c3b824f-ae3d-4681-8b14-16099a2643d5" (UID: "6c3b824f-ae3d-4681-8b14-16099a2643d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.589161 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3b824f-ae3d-4681-8b14-16099a2643d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.609441 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q" (OuterVolumeSpecName: "kube-api-access-4kv9q") pod "6c3b824f-ae3d-4681-8b14-16099a2643d5" (UID: "6c3b824f-ae3d-4681-8b14-16099a2643d5"). InnerVolumeSpecName "kube-api-access-4kv9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:50 crc kubenswrapper[4804]: I0217 13:45:50.694392 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kv9q\" (UniqueName: \"kubernetes.io/projected/6c3b824f-ae3d-4681-8b14-16099a2643d5-kube-api-access-4kv9q\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077292 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wmz" event={"ID":"6c3b824f-ae3d-4681-8b14-16099a2643d5","Type":"ContainerDied","Data":"756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28"} Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077545 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756263fa82eb7a6414b2f41ee32414c158acfb11d5c99ff9536bacfe978bdb28" Feb 17 13:45:51 crc kubenswrapper[4804]: I0217 13:45:51.077485 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wmz" Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.100132 4804 generic.go:334] "Generic (PLEG): container finished" podID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerID="acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a" exitCode=0 Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.100274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerDied","Data":"acbd8ddba5d51200f8256011420ddf0cc657b7bccf8bce2bdfa4bb2a827a329a"} Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.544820 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 
13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.563336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/90da6e89-6033-4e42-a5ca-bed1a5ad6a46-etc-swift\") pod \"swift-storage-0\" (UID: \"90da6e89-6033-4e42-a5ca-bed1a5ad6a46\") " pod="openstack/swift-storage-0" Feb 17 13:45:53 crc kubenswrapper[4804]: I0217 13:45:53.839900 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.408330 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rzcfd" podUID="9c049787-03d2-4679-8705-ec2cd1ad8141" containerName="ovn-controller" probeResult="failure" output=< Feb 17 13:45:54 crc kubenswrapper[4804]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 13:45:54 crc kubenswrapper[4804]: > Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.460802 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.463783 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p4wrm" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.698106 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:45:54 crc kubenswrapper[4804]: E0217 13:45:54.703442 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.703599 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.703812 4804 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" containerName="mariadb-account-create-update" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.704495 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.713593 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.758969 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872253 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872356 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872399 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872457 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872665 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.872761 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975249 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975390 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975436 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975466 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975532 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975578 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.975954 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.976037 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.977018 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:54 crc kubenswrapper[4804]: I0217 13:45:54.978728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.002151 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"ovn-controller-rzcfd-config-vlgpc\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.032432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.835589 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:45:55 crc kubenswrapper[4804]: I0217 13:45:55.835933 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:45:56 crc kubenswrapper[4804]: I0217 13:45:56.129260 4804 generic.go:334] "Generic (PLEG): container finished" podID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" exitCode=0 Feb 17 13:45:56 crc kubenswrapper[4804]: I0217 13:45:56.129302 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"} Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.581130 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.721368 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722046 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722774 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722870 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.722969 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") pod \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\" (UID: \"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2\") " Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723288 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.723645 4804 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.724338 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.727091 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9" (OuterVolumeSpecName: "kube-api-access-85nm9") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "kube-api-access-85nm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.736274 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.742978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts" (OuterVolumeSpecName: "scripts") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.744886 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.748248 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" (UID: "41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.824942 4804 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.824987 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825002 4804 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825014 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85nm9\" (UniqueName: \"kubernetes.io/projected/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-kube-api-access-85nm9\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825026 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.825037 4804 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 13:45:57 crc kubenswrapper[4804]: I0217 13:45:57.889820 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:45:57 crc kubenswrapper[4804]: W0217 13:45:57.893667 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bce6d8f_9e27_4d98_8003_f5e7b368f816.slice/crio-27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded WatchSource:0}: Error finding container 27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded: Status 404 returned error can't find the container with id 27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.067892 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 13:45:58 crc kubenswrapper[4804]: W0217 13:45:58.079611 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90da6e89_6033_4e42_a5ca_bed1a5ad6a46.slice/crio-4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013 WatchSource:0}: Error finding container 4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013: Status 404 returned error can't find the container with id 4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013 Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.144821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"4a6213d89aa08a99f9a0d8e92cb7ba034c19edd77c5b9177dedbd870a0911013"} Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.147293 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" 
event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerStarted","Data":"95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3"} Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.151519 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerStarted","Data":"27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded"} Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155332 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mv8w5" event={"ID":"41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2","Type":"ContainerDied","Data":"a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8"} Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155354 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mv8w5" Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.155376 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a563f1cc47989fe64effde3ad5ba60476f9907f0b572451fd849f90b4a6e4fa8" Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.157845 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerStarted","Data":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"} Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.158055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.169944 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lpd9f" podStartSLOduration=2.399252946 podStartE2EDuration="14.169924596s" podCreationTimestamp="2026-02-17 13:45:44 +0000 UTC" firstStartedPulling="2026-02-17 
13:45:45.718447209 +0000 UTC m=+1219.829866546" lastFinishedPulling="2026-02-17 13:45:57.489118849 +0000 UTC m=+1231.600538196" observedRunningTime="2026-02-17 13:45:58.167780779 +0000 UTC m=+1232.279200116" watchObservedRunningTime="2026-02-17 13:45:58.169924596 +0000 UTC m=+1232.281343933" Feb 17 13:45:58 crc kubenswrapper[4804]: I0217 13:45:58.193744 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371967.66105 podStartE2EDuration="1m9.193725193s" podCreationTimestamp="2026-02-17 13:44:49 +0000 UTC" firstStartedPulling="2026-02-17 13:44:52.039880363 +0000 UTC m=+1166.151299700" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:45:58.187406514 +0000 UTC m=+1232.298825861" watchObservedRunningTime="2026-02-17 13:45:58.193725193 +0000 UTC m=+1232.305144530" Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.168015 4804 generic.go:334] "Generic (PLEG): container finished" podID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerID="c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d" exitCode=0 Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.168121 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerDied","Data":"c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d"} Feb 17 13:45:59 crc kubenswrapper[4804]: I0217 13:45:59.534859 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rzcfd" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.178984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"5e0418dbdb94699ad2d329e623c23318e9ad1a1365dbd4cfbe0edb34b9c66c02"} Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179365 
4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"b6f4a6092962720efac5162bdeaa71212bcf64d18f13f7730dc09cfc8dd63143"} Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179381 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"fd27f69001d22a9707a7ed63ebbfea2eb121d3fa41a19780099c31844ef8b617"} Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.179393 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f6f8139dd06cc10cc071d3c4100e7a1764d859000d5a92c2c2bfd6734820d392"} Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.411606 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474903 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474997 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475040 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") pod 
\"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475092 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.474997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475159 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475242 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") pod \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\" (UID: \"5bce6d8f-9e27-4d98-8003-f5e7b368f816\") " Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475189 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run" (OuterVolumeSpecName: "var-run") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475660 4804 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475673 4804 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475681 4804 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bce6d8f-9e27-4d98-8003-f5e7b368f816-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.475894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.476009 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts" (OuterVolumeSpecName: "scripts") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.479679 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn" (OuterVolumeSpecName: "kube-api-access-qgdkn") pod "5bce6d8f-9e27-4d98-8003-f5e7b368f816" (UID: "5bce6d8f-9e27-4d98-8003-f5e7b368f816"). InnerVolumeSpecName "kube-api-access-qgdkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577060 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdkn\" (UniqueName: \"kubernetes.io/projected/5bce6d8f-9e27-4d98-8003-f5e7b368f816-kube-api-access-qgdkn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577098 4804 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:00 crc kubenswrapper[4804]: I0217 13:46:00.577111 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bce6d8f-9e27-4d98-8003-f5e7b368f816-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.192885 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f9de6394a041e5bc5915e2c672f2446d62b1bebdb25675d7caab649a201be2b3"} Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196197 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-vlgpc" event={"ID":"5bce6d8f-9e27-4d98-8003-f5e7b368f816","Type":"ContainerDied","Data":"27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded"} Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196260 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f2fbab7c5e24fe5afe4faa5acef412e06fa89877469179c920ed1be23faded" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.196286 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-vlgpc" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.561317 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.572655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rzcfd-config-vlgpc"] Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.628384 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663249 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:01 crc kubenswrapper[4804]: E0217 13:46:01.663739 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663763 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance" Feb 17 13:46:01 crc 
kubenswrapper[4804]: E0217 13:46:01.663802 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.663812 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.664013 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" containerName="ovn-config" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.664041 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2" containerName="swift-ring-rebalance" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.666303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.669378 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.703068 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802316 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod 
\"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802346 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.802518 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.903996 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904082 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904179 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904218 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904236 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904488 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.904859 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.905482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.913923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.925806 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"ovn-controller-rzcfd-config-2z52q\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:01 crc kubenswrapper[4804]: I0217 13:46:01.987225 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.212890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"686e2cfe93aa9073aa5c053faedb8c10fef0f4ced630f109a9553ace832d1d80"} Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.213307 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"1da734f3e094b44514d0b3c69a6a00f8a1495e89176bb397c8fe57eeacf4ad38"} Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.485146 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:02 crc kubenswrapper[4804]: I0217 13:46:02.588621 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bce6d8f-9e27-4d98-8003-f5e7b368f816" path="/var/lib/kubelet/pods/5bce6d8f-9e27-4d98-8003-f5e7b368f816/volumes" Feb 17 13:46:03 crc kubenswrapper[4804]: I0217 13:46:03.221890 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerStarted","Data":"68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933"} Feb 17 13:46:04 crc kubenswrapper[4804]: I0217 13:46:04.240792 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerStarted","Data":"525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c"} Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.251750 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"7bd65056ded46a7f415c5114a632dee5a15e7d850a90a8b84394e91556a340bc"} Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.254527 4804 generic.go:334] "Generic (PLEG): container finished" podID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerID="525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c" exitCode=0 Feb 17 13:46:05 crc kubenswrapper[4804]: I0217 13:46:05.254561 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerDied","Data":"525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.273595 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerID="95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3" exitCode=0 Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.273814 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerDied","Data":"95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3"} Feb 17 13:46:06 crc kubenswrapper[4804]: 
I0217 13:46:06.294173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"22f6ed363823056fbc0fd4a7c4cfe3d602bfcf8bfc5e65a12a3cde9a7e4b9bc6"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294234 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"f202da3b35bdf2ec3ead0862da3a8cbdee11edb5799c47d9b747d39fabd758c1"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"0cb79acfd998f6c023daac6db3e211de72172d9ea7ee223aeab1f68d231801de"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294257 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"93bebab89c543415262fbcca547ec300f432db9d85f2f1f08f1bb4d91eaa9893"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294267 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"fe1ecd36edd66ab9c042fcd2efeb8282870b6925b9fa54394bcdfd94212a4bdb"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.294277 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"91e9eadda5958c02bad315ad2fa1dc5c0fb9327307643e907256269aea3a4d1f"} Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.640996 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681785 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681936 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.681978 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682067 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682238 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682310 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") pod \"320d7daa-75d5-47da-8895-e49aa4bdbd01\" (UID: \"320d7daa-75d5-47da-8895-e49aa4bdbd01\") " Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682557 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run" (OuterVolumeSpecName: "var-run") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.682619 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683056 4804 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683088 4804 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683088 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.683666 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.684686 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts" (OuterVolumeSpecName: "scripts") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.686711 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z" (OuterVolumeSpecName: "kube-api-access-9bk2z") pod "320d7daa-75d5-47da-8895-e49aa4bdbd01" (UID: "320d7daa-75d5-47da-8895-e49aa4bdbd01"). InnerVolumeSpecName "kube-api-access-9bk2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784831 4804 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/320d7daa-75d5-47da-8895-e49aa4bdbd01-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784863 4804 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784878 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bk2z\" (UniqueName: \"kubernetes.io/projected/320d7daa-75d5-47da-8895-e49aa4bdbd01-kube-api-access-9bk2z\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:06 crc kubenswrapper[4804]: I0217 13:46:06.784889 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/320d7daa-75d5-47da-8895-e49aa4bdbd01-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.309234 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"90da6e89-6033-4e42-a5ca-bed1a5ad6a46","Type":"ContainerStarted","Data":"bb4ff7d1735d0f3e000d70765610bd98df2ee0015c339b7c178e631d7b1ad325"} Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310827 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rzcfd-config-2z52q" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rzcfd-config-2z52q" event={"ID":"320d7daa-75d5-47da-8895-e49aa4bdbd01","Type":"ContainerDied","Data":"68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933"} Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.310877 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ee1b720b46b359f54d44cc86dbc4b93cbd1ca1b6ba26986e9fe10248b66933" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.344844 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=24.254387256 podStartE2EDuration="31.344828247s" podCreationTimestamp="2026-02-17 13:45:36 +0000 UTC" firstStartedPulling="2026-02-17 13:45:58.086506392 +0000 UTC m=+1232.197925729" lastFinishedPulling="2026-02-17 13:46:05.176947383 +0000 UTC m=+1239.288366720" observedRunningTime="2026-02-17 13:46:07.342859175 +0000 UTC m=+1241.454278512" watchObservedRunningTime="2026-02-17 13:46:07.344828247 +0000 UTC m=+1241.456247584" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.636907 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:07 crc kubenswrapper[4804]: E0217 13:46:07.637541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.637557 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.637696 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" containerName="ovn-config" Feb 17 13:46:07 crc 
kubenswrapper[4804]: I0217 13:46:07.638476 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.640716 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.645605 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.700892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701015 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701177 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701271 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.701293 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.721005 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.727674 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rzcfd-config-2z52q"] Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.785993 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802449 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802545 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802612 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802642 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.802658 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803438 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803478 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.803607 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.804034 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.832442 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"dnsmasq-dns-5c79d794d7-m4fkl\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904281 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904437 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904565 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.904591 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkcbn\" (UniqueName: 
\"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") pod \"dfb6c8ec-f280-4566-bb37-b286119956b5\" (UID: \"dfb6c8ec-f280-4566-bb37-b286119956b5\") " Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.910120 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.910931 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn" (OuterVolumeSpecName: "kube-api-access-zkcbn") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "kube-api-access-zkcbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.935704 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.945913 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data" (OuterVolumeSpecName: "config-data") pod "dfb6c8ec-f280-4566-bb37-b286119956b5" (UID: "dfb6c8ec-f280-4566-bb37-b286119956b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:07 crc kubenswrapper[4804]: I0217 13:46:07.964597 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006306 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006341 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006354 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkcbn\" (UniqueName: \"kubernetes.io/projected/dfb6c8ec-f280-4566-bb37-b286119956b5-kube-api-access-zkcbn\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.006367 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb6c8ec-f280-4566-bb37-b286119956b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.326891 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lpd9f" event={"ID":"dfb6c8ec-f280-4566-bb37-b286119956b5","Type":"ContainerDied","Data":"8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18"} Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.327291 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b44ab8e952e1f3a0a808b705734302f9d0fba531b5c5b5df0002ed2ff150b18" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.327007 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lpd9f" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.423668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.612673 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320d7daa-75d5-47da-8895-e49aa4bdbd01" path="/var/lib/kubelet/pods/320d7daa-75d5-47da-8895-e49aa4bdbd01/volumes" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.765322 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.848807 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:08 crc kubenswrapper[4804]: E0217 13:46:08.849247 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.849273 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.849483 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" containerName="glance-db-sync" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.850727 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.870674 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938881 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938922 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.938945 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.939177 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:08 crc kubenswrapper[4804]: I0217 13:46:08.939264 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040673 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040717 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040791 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040809 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.040864 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041767 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.041877 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.042096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.042812 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.070173 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"dnsmasq-dns-5f59b8f679-hp2db\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.168463 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336548 4804 generic.go:334] "Generic (PLEG): container finished" podID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerID="e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21" exitCode=0 Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336701 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerDied","Data":"e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21"} Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.336867 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerStarted","Data":"1a11ba1d0a8c306ea4b2a4f940ad10c27214aa9a1c4f3dea92204634530ef96a"} Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.618080 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:09 crc kubenswrapper[4804]: W0217 13:46:09.618337 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd29250cb_6c2b_4994_ba6f_f3b7239ec3e2.slice/crio-9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f WatchSource:0}: Error finding container 9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f: Status 404 returned error can't find the container with id 9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.761500 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856775 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.856982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.857006 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.857104 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") pod \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\" (UID: \"e1f0a7c0-6169-479c-ac5c-9a30f7619603\") " Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.861681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc" (OuterVolumeSpecName: "kube-api-access-7vplc") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "kube-api-access-7vplc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.878450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.879387 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.880012 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config" (OuterVolumeSpecName: "config") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.881347 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.882550 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1f0a7c0-6169-479c-ac5c-9a30f7619603" (UID: "e1f0a7c0-6169-479c-ac5c-9a30f7619603"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959577 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959609 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959619 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959630 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 
crc kubenswrapper[4804]: I0217 13:46:09.959639 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1f0a7c0-6169-479c-ac5c-9a30f7619603-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:09 crc kubenswrapper[4804]: I0217 13:46:09.959649 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vplc\" (UniqueName: \"kubernetes.io/projected/e1f0a7c0-6169-479c-ac5c-9a30f7619603-kube-api-access-7vplc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.346980 4804 generic.go:334] "Generic (PLEG): container finished" podID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerID="db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06" exitCode=0 Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.347043 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.347094 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerStarted","Data":"9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349346 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" event={"ID":"e1f0a7c0-6169-479c-ac5c-9a30f7619603","Type":"ContainerDied","Data":"1a11ba1d0a8c306ea4b2a4f940ad10c27214aa9a1c4f3dea92204634530ef96a"} Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349388 4804 scope.go:117] "RemoveContainer" containerID="e097e65733863a0ea477698b59924cd597cb6c636bea03eec20a7fcebf703c21" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.349465 4804 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-m4fkl" Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.570672 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:10 crc kubenswrapper[4804]: I0217 13:46:10.587077 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-m4fkl"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.358607 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerStarted","Data":"d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f"} Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.358801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.385372 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podStartSLOduration=3.3853435530000002 podStartE2EDuration="3.385343553s" podCreationTimestamp="2026-02-17 13:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:11.37760546 +0000 UTC m=+1245.489024807" watchObservedRunningTime="2026-02-17 13:46:11.385343553 +0000 UTC m=+1245.496762900" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.500436 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.800345 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:11 crc kubenswrapper[4804]: E0217 13:46:11.801082 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801100 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801314 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" containerName="init" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.801962 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.825974 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.893335 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.893406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.903759 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.920361 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.927193 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.928316 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.928417 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.959583 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.972399 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995042 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995092 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: 
\"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.995374 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:11 crc kubenswrapper[4804]: I0217 13:46:11.996334 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.023892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmllm\" (UniqueName: 
\"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"cinder-db-create-ncwmc\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096645 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.096811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.098094 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.098807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.112025 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.113308 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.116184 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.121418 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.122616 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"barbican-db-create-46zbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.123584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.124752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"cinder-d59c-account-create-update-phgft\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.161924 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.163175 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166457 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166747 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.166953 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.167849 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.178275 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202749 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: 
\"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202953 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.202974 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.226265 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.227425 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.246099 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.284766 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.292002 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304883 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304944 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.304980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" 
(UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.305023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.305814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.306987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.310118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.312970 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " 
pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.326049 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"barbican-7982-account-create-update-pd5b7\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.333830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"keystone-db-sync-dgzbs\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408089 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.408721 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 
17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.432966 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"neutron-db-create-hdmw5\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.508449 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ncwmc"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.521052 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.537530 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.539724 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.540647 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.558597 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.562468 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.571643 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.597810 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f0a7c0-6169-479c-ac5c-9a30f7619603" path="/var/lib/kubelet/pods/e1f0a7c0-6169-479c-ac5c-9a30f7619603/volumes" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.613971 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.614347 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.716619 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 
13:46:12.716746 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.717962 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.741333 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"neutron-98b2-account-create-update-648xj\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.928585 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:12 crc kubenswrapper[4804]: I0217 13:46:12.946710 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"] Feb 17 13:46:12 crc kubenswrapper[4804]: W0217 13:46:12.955944 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35f1cc1f_a736_4c02_9c26_726c0c6f0d59.slice/crio-98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680 WatchSource:0}: Error finding container 98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680: Status 404 returned error can't find the container with id 98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680 Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.049018 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-46zbc"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.140279 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.246167 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hdmw5"] Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.322906 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dgzbs"] Feb 17 13:46:13 crc kubenswrapper[4804]: W0217 13:46:13.347686 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9036c7_1cff_4fb8_9af2_90057c4251dc.slice/crio-1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de WatchSource:0}: Error finding container 1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de: Status 404 returned error can't find the container with id 1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de Feb 17 
13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382418 4804 generic.go:334] "Generic (PLEG): container finished" podID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerID="2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278" exitCode=0 Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382508 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerDied","Data":"2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.382564 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerStarted","Data":"c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.383976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerStarted","Data":"584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.386120 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerStarted","Data":"1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.387350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerStarted","Data":"ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.388954 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" 
event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerStarted","Data":"9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.388978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerStarted","Data":"601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.394666 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerStarted","Data":"195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.394696 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerStarted","Data":"98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680"} Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.417162 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-46zbc" podStartSLOduration=2.41713908 podStartE2EDuration="2.41713908s" podCreationTimestamp="2026-02-17 13:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:13.415972974 +0000 UTC m=+1247.527392311" watchObservedRunningTime="2026-02-17 13:46:13.41713908 +0000 UTC m=+1247.528558417" Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.444081 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d59c-account-create-update-phgft" podStartSLOduration=2.444064177 podStartE2EDuration="2.444064177s" podCreationTimestamp="2026-02-17 13:46:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:13.438425539 +0000 UTC m=+1247.549844876" watchObservedRunningTime="2026-02-17 13:46:13.444064177 +0000 UTC m=+1247.555483514" Feb 17 13:46:13 crc kubenswrapper[4804]: I0217 13:46:13.460855 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"] Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.411910 4804 generic.go:334] "Generic (PLEG): container finished" podID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerID="7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.412055 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerDied","Data":"7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.413863 4804 generic.go:334] "Generic (PLEG): container finished" podID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerID="9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.413947 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerDied","Data":"9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.415860 4804 generic.go:334] "Generic (PLEG): container finished" podID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerID="195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.415938 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerDied","Data":"195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417744 4804 generic.go:334] "Generic (PLEG): container finished" podID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerID="e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417808 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerDied","Data":"e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.417830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerStarted","Data":"86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.419408 4804 generic.go:334] "Generic (PLEG): container finished" podID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerID="4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd" exitCode=0 Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.419461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerDied","Data":"4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd"} Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.741142 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.854420 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") pod \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855124 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4895769c-ef45-40c8-a8ae-0c5cb954dab2" (UID: "4895769c-ef45-40c8-a8ae-0c5cb954dab2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") pod \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\" (UID: \"4895769c-ef45-40c8-a8ae-0c5cb954dab2\") " Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.855920 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4895769c-ef45-40c8-a8ae-0c5cb954dab2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.866573 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm" (OuterVolumeSpecName: "kube-api-access-nmllm") pod "4895769c-ef45-40c8-a8ae-0c5cb954dab2" (UID: "4895769c-ef45-40c8-a8ae-0c5cb954dab2"). InnerVolumeSpecName "kube-api-access-nmllm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:14 crc kubenswrapper[4804]: I0217 13:46:14.957492 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmllm\" (UniqueName: \"kubernetes.io/projected/4895769c-ef45-40c8-a8ae-0c5cb954dab2-kube-api-access-nmllm\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.428837 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ncwmc" Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.430284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ncwmc" event={"ID":"4895769c-ef45-40c8-a8ae-0c5cb954dab2","Type":"ContainerDied","Data":"c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0"} Feb 17 13:46:15 crc kubenswrapper[4804]: I0217 13:46:15.430322 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96f9b2d88b6adfa9b44dae8cb976cc639f7f821be9d2828db0153f577444bf0" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.273490 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.280982 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.291928 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.295176 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.311173 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316355 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") pod \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") pod \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\" (UID: \"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.316808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") pod \"60ee8426-dcbf-4430-8594-68ee778a8bbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317002 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") pod \"60ee8426-dcbf-4430-8594-68ee778a8bbc\" (UID: \"60ee8426-dcbf-4430-8594-68ee778a8bbc\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") pod \"26fadc7a-6cf8-4ea0-8609-50e585db4115\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317290 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") pod \"26fadc7a-6cf8-4ea0-8609-50e585db4115\" (UID: \"26fadc7a-6cf8-4ea0-8609-50e585db4115\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317292 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" (UID: "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317582 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60ee8426-dcbf-4430-8594-68ee778a8bbc" (UID: "60ee8426-dcbf-4430-8594-68ee778a8bbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.317745 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26fadc7a-6cf8-4ea0-8609-50e585db4115" (UID: "26fadc7a-6cf8-4ea0-8609-50e585db4115"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318154 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318179 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60ee8426-dcbf-4430-8594-68ee778a8bbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.318193 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26fadc7a-6cf8-4ea0-8609-50e585db4115-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.324702 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n" (OuterVolumeSpecName: "kube-api-access-rt87n") pod "60ee8426-dcbf-4430-8594-68ee778a8bbc" (UID: "60ee8426-dcbf-4430-8594-68ee778a8bbc"). InnerVolumeSpecName "kube-api-access-rt87n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.327736 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw" (OuterVolumeSpecName: "kube-api-access-2gwpw") pod "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" (UID: "e64978ab-e30e-4ebf-bce0-a8e29d5e5adc"). InnerVolumeSpecName "kube-api-access-2gwpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.329322 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696" (OuterVolumeSpecName: "kube-api-access-td696") pod "26fadc7a-6cf8-4ea0-8609-50e585db4115" (UID: "26fadc7a-6cf8-4ea0-8609-50e585db4115"). InnerVolumeSpecName "kube-api-access-td696". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.418770 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") pod \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419389 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") pod \"e26c9257-7102-4d48-8999-c0a3f0ca4009\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35f1cc1f-a736-4c02-9c26-726c0c6f0d59" (UID: "35f1cc1f-a736-4c02-9c26-726c0c6f0d59"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419463 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") pod \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\" (UID: \"35f1cc1f-a736-4c02-9c26-726c0c6f0d59\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419516 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") pod \"e26c9257-7102-4d48-8999-c0a3f0ca4009\" (UID: \"e26c9257-7102-4d48-8999-c0a3f0ca4009\") " Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419899 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt87n\" (UniqueName: \"kubernetes.io/projected/60ee8426-dcbf-4430-8594-68ee778a8bbc-kube-api-access-rt87n\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419923 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td696\" (UniqueName: \"kubernetes.io/projected/26fadc7a-6cf8-4ea0-8609-50e585db4115-kube-api-access-td696\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419936 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419953 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gwpw\" (UniqueName: \"kubernetes.io/projected/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc-kube-api-access-2gwpw\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.419933 4804 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e26c9257-7102-4d48-8999-c0a3f0ca4009" (UID: "e26c9257-7102-4d48-8999-c0a3f0ca4009"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.424175 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h" (OuterVolumeSpecName: "kube-api-access-zsf6h") pod "e26c9257-7102-4d48-8999-c0a3f0ca4009" (UID: "e26c9257-7102-4d48-8999-c0a3f0ca4009"). InnerVolumeSpecName "kube-api-access-zsf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.424314 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p" (OuterVolumeSpecName: "kube-api-access-wvb4p") pod "35f1cc1f-a736-4c02-9c26-726c0c6f0d59" (UID: "35f1cc1f-a736-4c02-9c26-726c0c6f0d59"). InnerVolumeSpecName "kube-api-access-wvb4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457376 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-46zbc" event={"ID":"60ee8426-dcbf-4430-8594-68ee778a8bbc","Type":"ContainerDied","Data":"601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457448 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601e52925bb957989c7f25f9d646d8656693a83832f601e09c2331054b955310" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.457410 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-46zbc" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460847 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d59c-account-create-update-phgft" event={"ID":"35f1cc1f-a736-4c02-9c26-726c0c6f0d59","Type":"ContainerDied","Data":"98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460977 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98275f0e919ccf44d3043fc8e10d4f4fd1dfbb384a1679b9b4a328c739fa2680" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.460865 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d59c-account-create-update-phgft" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b2-account-create-update-648xj" event={"ID":"26fadc7a-6cf8-4ea0-8609-50e585db4115","Type":"ContainerDied","Data":"86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462364 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b2-account-create-update-648xj" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.462382 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86de1d053faa9d936fb63918e324800451303ce7017f7f3db74c27bc93776276" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464624 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7982-account-create-update-pd5b7" event={"ID":"e64978ab-e30e-4ebf-bce0-a8e29d5e5adc","Type":"ContainerDied","Data":"584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464674 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7982-account-create-update-pd5b7" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.464662 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="584460ac40b379634789213fb9875e27bc44f0755fab8cd37c4bc1a2c224a708" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hdmw5" event={"ID":"e26c9257-7102-4d48-8999-c0a3f0ca4009","Type":"ContainerDied","Data":"ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a"} Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466145 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff80344ef1bac1d5d7fbda5968ed0c5c11a256e94e9d793d91dda8265547ba1a" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.466221 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hdmw5" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521220 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsf6h\" (UniqueName: \"kubernetes.io/projected/e26c9257-7102-4d48-8999-c0a3f0ca4009-kube-api-access-zsf6h\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521253 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvb4p\" (UniqueName: \"kubernetes.io/projected/35f1cc1f-a736-4c02-9c26-726c0c6f0d59-kube-api-access-wvb4p\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:18 crc kubenswrapper[4804]: I0217 13:46:18.521265 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26c9257-7102-4d48-8999-c0a3f0ca4009-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.170346 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.245699 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.250872 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns" containerID="cri-o://f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" gracePeriod=10 Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.480345 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerStarted","Data":"ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"} Feb 17 13:46:19 crc kubenswrapper[4804]: I0217 13:46:19.505100 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dgzbs" podStartSLOduration=2.243136194 podStartE2EDuration="7.505071385s" podCreationTimestamp="2026-02-17 13:46:12 +0000 UTC" firstStartedPulling="2026-02-17 13:46:13.350576067 +0000 UTC m=+1247.461995404" lastFinishedPulling="2026-02-17 13:46:18.612511248 +0000 UTC m=+1252.723930595" observedRunningTime="2026-02-17 13:46:19.499256723 +0000 UTC m=+1253.610676060" watchObservedRunningTime="2026-02-17 13:46:19.505071385 +0000 UTC m=+1253.616490722" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.238494 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351596 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351750 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351809 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351874 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.351930 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") pod \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\" (UID: \"86aca321-b4a3-4d89-ab34-5d311aa11fe9\") " Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.358813 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq" (OuterVolumeSpecName: "kube-api-access-zfbbq") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "kube-api-access-zfbbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.399284 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.405554 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.414880 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.419918 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config" (OuterVolumeSpecName: "config") pod "86aca321-b4a3-4d89-ab34-5d311aa11fe9" (UID: "86aca321-b4a3-4d89-ab34-5d311aa11fe9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454163 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454240 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454255 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfbbq\" (UniqueName: \"kubernetes.io/projected/86aca321-b4a3-4d89-ab34-5d311aa11fe9-kube-api-access-zfbbq\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454266 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.454277 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86aca321-b4a3-4d89-ab34-5d311aa11fe9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492080 4804 generic.go:334] "Generic (PLEG): container finished" podID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" exitCode=0 Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492177 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"} Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492311 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mp4l9" event={"ID":"86aca321-b4a3-4d89-ab34-5d311aa11fe9","Type":"ContainerDied","Data":"cc022082e5090f1a0915d5020212d3b1d395c728921a669b0ed6b89573f0530f"} Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.492341 4804 scope.go:117] "RemoveContainer" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.541521 4804 scope.go:117] "RemoveContainer" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.542387 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.552063 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mp4l9"] Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.570380 4804 scope.go:117] "RemoveContainer" containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" Feb 17 13:46:20 crc kubenswrapper[4804]: E0217 13:46:20.570967 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": container with ID starting with f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49 not found: ID does not exist" 
containerID="f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571024 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49"} err="failed to get container status \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": rpc error: code = NotFound desc = could not find container \"f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49\": container with ID starting with f571b6a8fe768059eccc20a850b20c199a06962e3cc1247c34676a921af27d49 not found: ID does not exist" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571057 4804 scope.go:117] "RemoveContainer" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0" Feb 17 13:46:20 crc kubenswrapper[4804]: E0217 13:46:20.571559 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": container with ID starting with 2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0 not found: ID does not exist" containerID="2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.571716 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0"} err="failed to get container status \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": rpc error: code = NotFound desc = could not find container \"2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0\": container with ID starting with 2dffc4a386295a9c8c5efcda69222c6e5a062401a2d3d79656ce2b29324ac9c0 not found: ID does not exist" Feb 17 13:46:20 crc kubenswrapper[4804]: I0217 13:46:20.586079 4804 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" path="/var/lib/kubelet/pods/86aca321-b4a3-4d89-ab34-5d311aa11fe9/volumes" Feb 17 13:46:22 crc kubenswrapper[4804]: I0217 13:46:22.512100 4804 generic.go:334] "Generic (PLEG): container finished" podID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerID="ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde" exitCode=0 Feb 17 13:46:22 crc kubenswrapper[4804]: I0217 13:46:22.512214 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerDied","Data":"ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"} Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.823862 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913500 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913608 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.913710 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") pod \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\" (UID: \"fd9036c7-1cff-4fb8-9af2-90057c4251dc\") " 
Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.918883 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw" (OuterVolumeSpecName: "kube-api-access-rn9kw") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "kube-api-access-rn9kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.936348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:23 crc kubenswrapper[4804]: I0217 13:46:23.960999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data" (OuterVolumeSpecName: "config-data") pod "fd9036c7-1cff-4fb8-9af2-90057c4251dc" (UID: "fd9036c7-1cff-4fb8-9af2-90057c4251dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015743 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015783 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd9036c7-1cff-4fb8-9af2-90057c4251dc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.015799 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn9kw\" (UniqueName: \"kubernetes.io/projected/fd9036c7-1cff-4fb8-9af2-90057c4251dc-kube-api-access-rn9kw\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530430 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dgzbs" event={"ID":"fd9036c7-1cff-4fb8-9af2-90057c4251dc","Type":"ContainerDied","Data":"1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de"} Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530470 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b6b5b01d9570776dd1a52c7a9cea46e9e30d04a83c7ce13cfd62e96901e77de" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.530536 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dgzbs" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799456 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799880 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799900 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799922 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799931 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799949 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="init" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799958 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="init" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799971 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.799978 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.799993 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" 
containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800002 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800012 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800019 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800037 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800045 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800063 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800070 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: E0217 13:46:24.800083 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800090 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800287 4804 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="86aca321-b4a3-4d89-ab34-5d311aa11fe9" containerName="dnsmasq-dns" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800305 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800315 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800329 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800341 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800353 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" containerName="mariadb-database-create" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800361 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" containerName="mariadb-account-create-update" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.800370 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" containerName="keystone-db-sync" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.801380 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.821467 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.830886 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.830967 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831027 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831088 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831119 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.831161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.842120 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.843218 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.846974 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.847318 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.847700 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.848476 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.848631 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.877674 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:24 crc 
kubenswrapper[4804]: I0217 13:46:24.933614 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.933963 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934247 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc 
kubenswrapper[4804]: I0217 13:46:24.934526 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.934996 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 
13:46:24.935106 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.937041 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.937622 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.938974 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939017 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939285 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.939580 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:24 crc kubenswrapper[4804]: I0217 13:46:24.992841 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"dnsmasq-dns-bbf5cc879-m5j4j\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041063 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041176 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod 
\"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041226 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.041305 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.049260 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f9zkj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.054357 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.059869 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r5hqb" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.060243 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.062786 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.062885 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066608 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066911 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.066980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " 
pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.082262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.092603 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f9zkj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.093111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"keystone-bootstrap-kmbrx\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.114435 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.116183 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.121565 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.121803 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-75jkk" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.122003 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.123882 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.124613 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.142661 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151014 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151078 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.151154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151173 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151212 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151371 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.151398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.151420 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.168576 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.204697 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.207618 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.216948 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.217178 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.230315 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253547 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253600 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253621 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253727 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253750 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253797 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc 
kubenswrapper[4804]: I0217 13:46:25.253824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253847 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253880 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253933 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.253991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.254017 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.254041 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.255184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.258025 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " 
pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.260418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.260758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.261423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.262946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.268644 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.269120 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.302666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.314434 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"cinder-db-sync-f9zkj\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.314554 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"horizon-d659f57fc-rp4h6\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.337859 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jltn7"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.340616 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.355558 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.355888 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.356057 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mckmx" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.357107 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.366102 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369527 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369606 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.369728 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369847 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.369875 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.372538 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.373362 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.378044 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.380019 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.380837 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.382701 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.389519 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.397659 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.398729 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.406854 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jltn7"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.432098 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.432359 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.436037 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6zhqd" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.449458 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"ceilometer-0\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.452392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.463054 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.464777 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469641 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469820 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-r5s28" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.469929 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.470126 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471084 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471112 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471132 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471158 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471216 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471237 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471256 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.471284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.473477 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.480882 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.496601 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.498153 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.506374 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xf9m6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.507586 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.514254 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.515595 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.522829 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523377 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dr6jm" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523793 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.523911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.530010 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.558276 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xf9m6"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 
13:46:25.560094 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585176 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585237 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585279 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585306 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: 
\"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585326 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585352 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585406 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585429 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585447 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585466 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585482 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585496 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585516 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585534 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585566 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.585582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586296 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586333 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586702 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.586725 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587456 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v577c\" (UniqueName: 
\"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587678 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587906 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.587939 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588587 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588959 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.588984 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589008 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " 
pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589036 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589059 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589130 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.589152 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.590582 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.590968 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.603727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.685749 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.687711 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.687965 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688059 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688384 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"horizon-5c54d4859c-6cf2w\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.688799 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"barbican-db-sync-jz9x9\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.691332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"neutron-db-sync-jltn7\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.694186 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698386 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698494 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698544 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " 
pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698572 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698631 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698681 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698700 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698721 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698742 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698795 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698859 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698879 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698895 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698918 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod 
\"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698967 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.698993 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699090 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod 
\"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.699116 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.715163 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.716272 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.717844 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.718550 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc 
kubenswrapper[4804]: I0217 13:46:25.719515 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.720718 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.721583 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.736188 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.750641 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.750668 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: 
\"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.751378 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.752750 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.754689 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.756855 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.759168 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" 
Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.766873 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.782346 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.782866 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.784470 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786378 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786774 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786867 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.786938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.793326 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.797579 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822192 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"placement-db-sync-xf9m6\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822229 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.822799 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.823341 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.825408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.826638 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"dnsmasq-dns-56df8fb6b7-mtwfj\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.833651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.835773 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.835838 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.839714 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848037 4804 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848155 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" gracePeriod=600 Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.848427 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.902935 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:46:25 crc kubenswrapper[4804]: I0217 13:46:25.961515 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.030007 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.093735 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.161350 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.454343 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a921c8_6579_451b_beaf_9832cf900668.slice/crio-ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c WatchSource:0}: Error finding container ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c: Status 404 returned error can't find the container with id ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.455518 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f9zkj"] Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.487006 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.488272 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429f2d90_393a_4205_9597_4a1d92dd15be.slice/crio-fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff WatchSource:0}: Error finding container fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff: Status 404 returned error can't find the container with id fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.569460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerStarted","Data":"22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5"} Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 
13:46:26.572333 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" exitCode=0 Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.572367 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7"} Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.572434 4804 scope.go:117] "RemoveContainer" containerID="0320866c2bb2dbd13ef711a6f5701e23927765988c8998787dbdeb879aaaaa69" Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.584972 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c54d4859c-6cf2w" event={"ID":"429f2d90-393a-4205-9597-4a1d92dd15be","Type":"ContainerStarted","Data":"fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff"} Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.585003 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerStarted","Data":"ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c"} Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.585014 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerStarted","Data":"46aadc0bc9a1c8319216c31393e9e2b8f0ba23fa61327f5c2056cca1df2f582b"} Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.661552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:26 crc kubenswrapper[4804]: W0217 13:46:26.718496 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15102ce_82ca_49c8_a069_25469380b043.slice/crio-87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14 WatchSource:0}: Error finding container 87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14: Status 404 returned error can't find the container with id 87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14 Feb 17 13:46:26 crc kubenswrapper[4804]: I0217 13:46:26.721342 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jltn7"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.066837 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.087921 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.096384 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xf9m6"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.105476 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.115555 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa3f342_a062_421d_8c06_f53468a8db00.slice/crio-cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50 WatchSource:0}: Error finding container cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50: Status 404 returned error can't find the container with id cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50 Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.132743 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19dd0c13_b898_4147_ae5f_cbc5d4915910.slice/crio-3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874 WatchSource:0}: Error finding container 3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874: Status 404 returned error can't find the container with id 3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874 Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.170369 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e1fc7b_0e6c_4377_b4e0_74e77e951b0d.slice/crio-ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75 WatchSource:0}: Error finding container ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75: Status 404 returned error can't find the container with id ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75 Feb 17 13:46:27 crc kubenswrapper[4804]: W0217 13:46:27.186094 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5ccd477_88cd_4284_9de7_f336def1c7a1.slice/crio-9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0 WatchSource:0}: Error finding container 9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0: Status 404 returned error can't find the container with id 9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0 Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.215855 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.268066 4804 scope.go:117] "RemoveContainer" containerID="936d92768f8545882fd9f589c352b0f3e05694fdb88b93635d612b3de2273f31" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.364831 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.379998 4804 scope.go:117] "RemoveContainer" containerID="4c6c05689b4d8003c577d2fa36fd3fe297914eaa29a6a636dc47b237ac9d795d" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.394836 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.476079 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.478892 4804 scope.go:117] "RemoveContainer" containerID="a58356e342b8d1a0c197b929d754c94eace180ca8295bdab19e683e521269b3f" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.480043 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.517497 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.530414 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.557037 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572321 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572491 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.572527 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.630568 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerStarted","Data":"fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.645364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0"} Feb 17 13:46:27 
crc kubenswrapper[4804]: I0217 13:46:27.648550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"9b93e2a2a156279c02c6d5f10d5f5e53f3649c24aa9d244b7707ada20e287204"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.652018 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerStarted","Data":"cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.658398 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kmbrx" podStartSLOduration=3.658292 podStartE2EDuration="3.658292s" podCreationTimestamp="2026-02-17 13:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:27.648885835 +0000 UTC m=+1261.760305172" watchObservedRunningTime="2026-02-17 13:46:27.658292 +0000 UTC m=+1261.769711337" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.665880 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerStarted","Data":"3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676499 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676697 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.676806 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.677629 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.678550 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"horizon-6b56868599-9s4h9\" 
(UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.678752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.691284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.699757 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.702923 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"horizon-6b56868599-9s4h9\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.703492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d659f57fc-rp4h6" event={"ID":"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9","Type":"ContainerStarted","Data":"38398873d4e244754b25fb3ccf4b8d269e3c191649fc54d1a45b9951042bdc7f"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.705814 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerID="83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8" exitCode=0 Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.705862 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerDied","Data":"83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.721067 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerStarted","Data":"4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.721108 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerStarted","Data":"87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.723468 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerStarted","Data":"ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75"} Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.753576 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jltn7" podStartSLOduration=2.753554254 podStartE2EDuration="2.753554254s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:27.743054724 +0000 UTC m=+1261.854474061" watchObservedRunningTime="2026-02-17 13:46:27.753554254 +0000 UTC m=+1261.864973591" Feb 17 13:46:27 crc kubenswrapper[4804]: I0217 13:46:27.818512 
4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.117463 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.187262 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188458 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188512 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") pod 
\"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.188605 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") pod \"c8639d8d-c367-40a9-b26c-c7c301b82609\" (UID: \"c8639d8d-c367-40a9-b26c-c7c301b82609\") " Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.226346 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2" (OuterVolumeSpecName: "kube-api-access-prjx2") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "kube-api-access-prjx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.233816 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.238543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.242829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.244169 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.250603 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config" (OuterVolumeSpecName: "config") pod "c8639d8d-c367-40a9-b26c-c7c301b82609" (UID: "c8639d8d-c367-40a9-b26c-c7c301b82609"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.264833 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292374 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292405 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292416 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjx2\" (UniqueName: \"kubernetes.io/projected/c8639d8d-c367-40a9-b26c-c7c301b82609-kube-api-access-prjx2\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292425 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.292435 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.293094 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8639d8d-c367-40a9-b26c-c7c301b82609-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.492808 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:28 
crc kubenswrapper[4804]: W0217 13:46:28.553218 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40906b1_b78c_4e65_9c35_346626adeba3.slice/crio-0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac WatchSource:0}: Error finding container 0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac: Status 404 returned error can't find the container with id 0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771270 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" event={"ID":"c8639d8d-c367-40a9-b26c-c7c301b82609","Type":"ContainerDied","Data":"46aadc0bc9a1c8319216c31393e9e2b8f0ba23fa61327f5c2056cca1df2f582b"} Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771683 4804 scope.go:117] "RemoveContainer" containerID="83533c9dfdbc39c545b70abc4b708d58af7e792f59972297b23faa02fcbb40b8" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.771576 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-m5j4j" Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.777829 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd"} Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.780126 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b56868599-9s4h9" event={"ID":"c40906b1-b78c-4e65-9c35-346626adeba3","Type":"ContainerStarted","Data":"0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac"} Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.783356 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fa3f342-a062-421d-8c06-f53468a8db00" containerID="63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574" exitCode=0 Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.785319 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574"} Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.787964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c"} Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.831339 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:28 crc kubenswrapper[4804]: I0217 13:46:28.847731 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-m5j4j"] Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.863608 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerStarted","Data":"b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41"} Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.864632 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.875339 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765"} Feb 17 13:46:29 crc kubenswrapper[4804]: I0217 13:46:29.892798 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" podStartSLOduration=4.892770547 podStartE2EDuration="4.892770547s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:29.885111267 +0000 UTC m=+1263.996530614" watchObservedRunningTime="2026-02-17 13:46:29.892770547 +0000 UTC m=+1264.004189874" Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.593731 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" path="/var/lib/kubelet/pods/c8639d8d-c367-40a9-b26c-c7c301b82609/volumes" Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.890822 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" containerID="cri-o://08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" gracePeriod=30 Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.891110 4804 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerStarted","Data":"8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e"} Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.891422 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" containerID="cri-o://8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" gracePeriod=30 Feb 17 13:46:30 crc kubenswrapper[4804]: I0217 13:46:30.918733 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.918718167 podStartE2EDuration="5.918718167s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:30.913072459 +0000 UTC m=+1265.024491806" watchObservedRunningTime="2026-02-17 13:46:30.918718167 +0000 UTC m=+1265.030137504" Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.906039 4804 generic.go:334] "Generic (PLEG): container finished" podID="53073bd8-b356-4cb8-a190-db417f233b63" containerID="fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0" exitCode=0 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.906583 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerDied","Data":"fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.916830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerStarted","Data":"8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.917000 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" containerID="cri-o://dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765" gracePeriod=30 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.917292 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" containerID="cri-o://8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709" gracePeriod=30 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933011 4804 generic.go:334] "Generic (PLEG): container finished" podID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerID="8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" exitCode=0 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933052 4804 generic.go:334] "Generic (PLEG): container finished" podID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerID="08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" exitCode=143 Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933081 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e"} Feb 17 13:46:31 crc kubenswrapper[4804]: I0217 13:46:31.933115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd"} Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945657 4804 generic.go:334] "Generic (PLEG): container finished" podID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerID="8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709" exitCode=0 Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945942 4804 generic.go:334] "Generic (PLEG): container finished" podID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerID="dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765" exitCode=143 Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.945821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709"} Feb 17 13:46:32 crc kubenswrapper[4804]: I0217 13:46:32.946182 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.616090 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.624573 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.654335 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.654311106 podStartE2EDuration="8.654311106s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:31.97438246 +0000 UTC m=+1266.085801807" watchObservedRunningTime="2026-02-17 13:46:33.654311106 +0000 UTC m=+1267.765730443" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715734 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715792 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715827 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715868 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: 
\"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715899 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715977 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.715995 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716034 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716094 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716123 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") pod \"e06838c2-047e-4746-bb20-735a1eb9cb37\" (UID: \"e06838c2-047e-4746-bb20-735a1eb9cb37\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.716221 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") pod \"53073bd8-b356-4cb8-a190-db417f233b63\" (UID: \"53073bd8-b356-4cb8-a190-db417f233b63\") " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.717219 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.729978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts" (OuterVolumeSpecName: "scripts") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.733572 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs" (OuterVolumeSpecName: "logs") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.748465 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.748637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.749308 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c" (OuterVolumeSpecName: "kube-api-access-lcv6c") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "kube-api-access-lcv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.752650 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4" (OuterVolumeSpecName: "kube-api-access-j2bf4") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "kube-api-access-j2bf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.755527 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.756506 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts" (OuterVolumeSpecName: "scripts") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.778687 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data" (OuterVolumeSpecName: "config-data") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.789589 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53073bd8-b356-4cb8-a190-db417f233b63" (UID: "53073bd8-b356-4cb8-a190-db417f233b63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.793238 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.797719 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817844 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817885 4804 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817900 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcv6c\" (UniqueName: \"kubernetes.io/projected/e06838c2-047e-4746-bb20-735a1eb9cb37-kube-api-access-lcv6c\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817912 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817921 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817933 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817945 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817956 4804 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-j2bf4\" (UniqueName: \"kubernetes.io/projected/53073bd8-b356-4cb8-a190-db417f233b63-kube-api-access-j2bf4\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817965 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53073bd8-b356-4cb8-a190-db417f233b63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817974 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06838c2-047e-4746-bb20-735a1eb9cb37-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817986 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.817999 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.818009 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.821150 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data" (OuterVolumeSpecName: "config-data") pod "e06838c2-047e-4746-bb20-735a1eb9cb37" (UID: "e06838c2-047e-4746-bb20-735a1eb9cb37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.852553 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.919044 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.919076 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06838c2-047e-4746-bb20-735a1eb9cb37-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968084 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e06838c2-047e-4746-bb20-735a1eb9cb37","Type":"ContainerDied","Data":"9b93e2a2a156279c02c6d5f10d5f5e53f3649c24aa9d244b7707ada20e287204"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968094 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.968149 4804 scope.go:117] "RemoveContainer" containerID="8c54c0fbf665a0f62da46d5e48fa201c7c07fe926f4bbb23290d21ea751f360e" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971604 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmbrx" event={"ID":"53073bd8-b356-4cb8-a190-db417f233b63","Type":"ContainerDied","Data":"22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5"} Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971647 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmbrx" Feb 17 13:46:33 crc kubenswrapper[4804]: I0217 13:46:33.971647 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22cc18bf0c3204362054a8f4e626573eec2da03d934e88db8cdb3c72fda9d1e5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.017763 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.055395 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068179 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068649 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068672 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068709 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068717 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: E0217 13:46:34.068731 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068739 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: 
E0217 13:46:34.068754 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068761 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068963 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="53073bd8-b356-4cb8-a190-db417f233b63" containerName="keystone-bootstrap" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.068988 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-log" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.069010 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8639d8d-c367-40a9-b26c-c7c301b82609" containerName="init" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.069022 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" containerName="glance-httpd" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.070108 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.072761 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.072968 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.096653 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.116346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.129857 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kmbrx"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.177892 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.183383 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.184894 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.185866 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186121 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186323 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186500 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.186537 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223746 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223784 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.223812 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224186 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.224378 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331610 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331739 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331795 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331820 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331880 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331946 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.331989 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332043 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332131 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332647 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.332888 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.333412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.339802 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340053 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340286 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.340434 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.360033 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.376115 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.377397 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.395580 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.413593 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.434490 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.434886 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435006 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435094 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.435115 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440159 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440405 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.440853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.441455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " 
pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.442129 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.451775 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.457923 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.474376 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.481784 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.489926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"keystone-bootstrap-7kgzk\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.496249 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.514175 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.517259 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.519151 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537086 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537175 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537351 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537411 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537448 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537475 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537498 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537544 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537582 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537613 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.537737 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc 
kubenswrapper[4804]: I0217 13:46:34.545398 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.594097 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53073bd8-b356-4cb8-a190-db417f233b63" path="/var/lib/kubelet/pods/53073bd8-b356-4cb8-a190-db417f233b63/volumes" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.594995 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06838c2-047e-4746-bb20-735a1eb9cb37" path="/var/lib/kubelet/pods/e06838c2-047e-4746-bb20-735a1eb9cb37/volumes" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639653 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: 
\"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639869 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.639935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640062 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640102 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640169 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 
13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.640984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-logs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.641613 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-scripts\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.642461 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.643848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-config-data\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.644321 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.646317 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659499 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-combined-ca-bundle\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659591 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.659893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-tls-certs\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: 
\"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.660463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-horizon-secret-key\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.662480 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"horizon-58989b55cb-zjfvf\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.664455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vf9\" (UniqueName: \"kubernetes.io/projected/e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f-kube-api-access-v9vf9\") pod \"horizon-9ffb6f5c6-fczv5\" (UID: \"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f\") " pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.869628 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:46:34 crc kubenswrapper[4804]: I0217 13:46:34.878140 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.032584 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.106608 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:36 crc kubenswrapper[4804]: I0217 13:46:36.106904 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" containerID="cri-o://d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" gracePeriod=10 Feb 17 13:46:37 crc kubenswrapper[4804]: I0217 13:46:37.008959 4804 generic.go:334] "Generic (PLEG): container finished" podID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerID="d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" exitCode=0 Feb 17 13:46:37 crc kubenswrapper[4804]: I0217 13:46:37.009070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f"} Feb 17 13:46:39 crc kubenswrapper[4804]: I0217 13:46:39.169937 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 17 13:46:44 crc kubenswrapper[4804]: I0217 13:46:44.169941 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Feb 17 
13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.675447 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.676336 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n76h57h5b7hf6h685h68ch58ch597h99h65fh5c9hcbh58dh66chb5hf6h6dh654h648hch589h66fhf7h5d7hb6h7dh5f8h78h598h559h644h56bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42bnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400
,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d659f57fc-rp4h6_openstack(64f8b969-63f0-4c36-baa9-e86e4b0bf0d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:45 crc kubenswrapper[4804]: E0217 13:46:45.680070 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d659f57fc-rp4h6" podUID="64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.108491 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae662df3-8898-4509-b820-2a918ad3ad7a","Type":"ContainerDied","Data":"49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c"} Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.108553 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f64571d9f610637811cf86586750a6d15c78928db1891dc4609e905bc4b08c" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.157186 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311225 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311339 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311423 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311455 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311529 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311606 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.311686 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") pod \"ae662df3-8898-4509-b820-2a918ad3ad7a\" (UID: \"ae662df3-8898-4509-b820-2a918ad3ad7a\") " Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.312045 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.312773 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs" (OuterVolumeSpecName: "logs") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.319935 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts" (OuterVolumeSpecName: "scripts") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.320023 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.344886 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.347480 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6" (OuterVolumeSpecName: "kube-api-access-f9lw6") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "kube-api-access-f9lw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.361963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.364526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data" (OuterVolumeSpecName: "config-data") pod "ae662df3-8898-4509-b820-2a918ad3ad7a" (UID: "ae662df3-8898-4509-b820-2a918ad3ad7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417859 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417910 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417923 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.417971 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 13:46:46 
crc kubenswrapper[4804]: I0217 13:46:46.417987 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9lw6\" (UniqueName: \"kubernetes.io/projected/ae662df3-8898-4509-b820-2a918ad3ad7a-kube-api-access-f9lw6\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418002 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418013 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae662df3-8898-4509-b820-2a918ad3ad7a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.418025 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae662df3-8898-4509-b820-2a918ad3ad7a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.447373 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.475453 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.475667 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v577c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-jz9x9_openstack(19dd0c13-b898-4147-ae5f-cbc5d4915910): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:46 crc kubenswrapper[4804]: E0217 13:46:46.476939 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-jz9x9" 
podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" Feb 17 13:46:46 crc kubenswrapper[4804]: I0217 13:46:46.519490 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.127241 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.129335 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-jz9x9" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.139768 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.143583 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n547h685h5c7h66fh5b5h7fhcfh689h59bh5ffh658h54bh8bh5cdh68h595hb6h5cfh57fhc9hfdh648h5f6h5d6h554h85h589h67dh5c7h5fbh5b7hdcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jjjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e5ccd477-88cd-4284-9de7-f336def1c7a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.148710 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.148877 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h57dh657h585h7fh5d5h677h685hcbh86h5fdh75h5b6h689h86hd6hf5h545h55bhffh674hf6h574h5c8h79h68bhffh8dh56h555h699h656q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45pwj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b56868599-9s4h9_openstack(c40906b1-b78c-4e65-9c35-346626adeba3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 
13:46:47.150815 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6b56868599-9s4h9" podUID="c40906b1-b78c-4e65-9c35-346626adeba3" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.185783 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.185989 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch54ch549h6bh85h58ch68bhd6hc8hc7h5ffh649h5b8h5d5h647h7fh74h54h56dhb7h55chb6hc4h68fh5b6hch5c6h694h86h65dh5d5hcdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f28sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5c54d4859c-6cf2w_openstack(429f2d90-393a-4205-9597-4a1d92dd15be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 
13:46:47.188355 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5c54d4859c-6cf2w" podUID="429f2d90-393a-4205-9597-4a1d92dd15be" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.232679 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.243096 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256010 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.256402 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256416 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: E0217 13:46:47.256439 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256445 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.256611 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-httpd" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 
13:46:47.256633 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" containerName="glance-log" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.257547 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.262285 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.262331 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.376239 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442065 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442278 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442397 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442510 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442542 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442570 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.442668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544353 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544445 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544552 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: 
\"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544601 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544621 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544660 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.544926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.545692 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc 
kubenswrapper[4804]: I0217 13:46:47.545914 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.550814 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.559351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.564544 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.565683 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.574249 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.582949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:46:47 crc kubenswrapper[4804]: I0217 13:46:47.591360 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:46:48 crc kubenswrapper[4804]: I0217 13:46:48.588947 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae662df3-8898-4509-b820-2a918ad3ad7a" path="/var/lib/kubelet/pods/ae662df3-8898-4509-b820-2a918ad3ad7a/volumes" Feb 17 13:46:51 crc kubenswrapper[4804]: I0217 13:46:51.159683 4804 generic.go:334] "Generic (PLEG): container finished" podID="f15102ce-82ca-49c8-a069-25469380b043" containerID="4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2" exitCode=0 Feb 17 13:46:51 crc kubenswrapper[4804]: I0217 13:46:51.159771 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerDied","Data":"4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2"} Feb 17 13:46:54 crc kubenswrapper[4804]: I0217 13:46:54.169427 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 17 13:46:54 crc 
kubenswrapper[4804]: I0217 13:46:54.170512 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.009902 4804 scope.go:117] "RemoveContainer" containerID="08670815fbd1e9b758c348fb93fa962c48c717b87693dc123c304e7d0ec4d4cd" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.109240 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.122184 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.132373 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.146989 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.162323 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.203686 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b56868599-9s4h9" event={"ID":"c40906b1-b78c-4e65-9c35-346626adeba3","Type":"ContainerDied","Data":"0364542f4068f261d045963e550e809dad4e8639c09b095d8e25392e8b58c1ac"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.203777 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b56868599-9s4h9" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204693 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204850 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.204876 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") pod \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\" (UID: \"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206020 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data" (OuterVolumeSpecName: "config-data") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206005 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts" (OuterVolumeSpecName: "scripts") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.206134 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs" (OuterVolumeSpecName: "logs") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.210317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c54d4859c-6cf2w" event={"ID":"429f2d90-393a-4205-9597-4a1d92dd15be","Type":"ContainerDied","Data":"fce38230f6cfdb1a15c8055882ba0aa717f46a7acf9c7b94e71d76d6f0c3a6ff"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.210476 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c54d4859c-6cf2w" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.213438 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.216569 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj" (OuterVolumeSpecName: "kube-api-access-42bnj") pod "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" (UID: "64f8b969-63f0-4c36-baa9-e86e4b0bf0d9"). InnerVolumeSpecName "kube-api-access-42bnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.221859 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d659f57fc-rp4h6" event={"ID":"64f8b969-63f0-4c36-baa9-e86e4b0bf0d9","Type":"ContainerDied","Data":"38398873d4e244754b25fb3ccf4b8d269e3c191649fc54d1a45b9951042bdc7f"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.222074 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d659f57fc-rp4h6" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235364 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jltn7" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235924 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jltn7" event={"ID":"f15102ce-82ca-49c8-a069-25469380b043","Type":"ContainerDied","Data":"87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.235968 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87611d75a6337c3c8516a80920686721f7c8d0bd90cc4c5bcd9d5239128f8f14" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.239667 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" event={"ID":"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2","Type":"ContainerDied","Data":"9e9b5616dd62b1afbb31c7b84604c193960d32aba2443b3713adaf3e69d9332f"} Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.239782 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306252 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") pod \"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306315 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306339 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306446 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306532 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306553 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306581 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306611 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306643 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 
13:46:55.306661 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306689 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306723 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306748 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") pod \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\" (UID: \"d29250cb-6c2b-4994-ba6f-f3b7239ec3e2\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306791 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") pod \"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306815 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") pod 
\"f15102ce-82ca-49c8-a069-25469380b043\" (UID: \"f15102ce-82ca-49c8-a069-25469380b043\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306835 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") pod \"c40906b1-b78c-4e65-9c35-346626adeba3\" (UID: \"c40906b1-b78c-4e65-9c35-346626adeba3\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.306879 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") pod \"429f2d90-393a-4205-9597-4a1d92dd15be\" (UID: \"429f2d90-393a-4205-9597-4a1d92dd15be\") " Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.308989 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309019 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-kube-api-access-42bnj\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309032 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309043 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309056 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309891 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts" (OuterVolumeSpecName: "scripts") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.309996 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts" (OuterVolumeSpecName: "scripts") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.310642 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs" (OuterVolumeSpecName: "logs") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.311494 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs" (OuterVolumeSpecName: "logs") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.311841 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data" (OuterVolumeSpecName: "config-data") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314335 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm" (OuterVolumeSpecName: "kube-api-access-gg5rm") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "kube-api-access-gg5rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.314876 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx" (OuterVolumeSpecName: "kube-api-access-f28sx") pod "429f2d90-393a-4205-9597-4a1d92dd15be" (UID: "429f2d90-393a-4205-9597-4a1d92dd15be"). InnerVolumeSpecName "kube-api-access-f28sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.315384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh" (OuterVolumeSpecName: "kube-api-access-58rdh") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "kube-api-access-58rdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.321708 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data" (OuterVolumeSpecName: "config-data") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.325590 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.353587 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj" (OuterVolumeSpecName: "kube-api-access-45pwj") pod "c40906b1-b78c-4e65-9c35-346626adeba3" (UID: "c40906b1-b78c-4e65-9c35-346626adeba3"). InnerVolumeSpecName "kube-api-access-45pwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.355594 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.367903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.372833 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d659f57fc-rp4h6"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.384254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.385027 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config" (OuterVolumeSpecName: "config") pod "f15102ce-82ca-49c8-a069-25469380b043" (UID: "f15102ce-82ca-49c8-a069-25469380b043"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.391344 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.394636 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.396883 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config" (OuterVolumeSpecName: "config") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.397957 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" (UID: "d29250cb-6c2b-4994-ba6f-f3b7239ec3e2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410913 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410969 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58rdh\" (UniqueName: \"kubernetes.io/projected/f15102ce-82ca-49c8-a069-25469380b043-kube-api-access-58rdh\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410984 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/429f2d90-393a-4205-9597-4a1d92dd15be-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.410995 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28sx\" (UniqueName: \"kubernetes.io/projected/429f2d90-393a-4205-9597-4a1d92dd15be-kube-api-access-f28sx\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411007 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg5rm\" (UniqueName: \"kubernetes.io/projected/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-kube-api-access-gg5rm\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411019 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/429f2d90-393a-4205-9597-4a1d92dd15be-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411029 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: 
I0217 13:46:55.411040 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411050 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c40906b1-b78c-4e65-9c35-346626adeba3-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411059 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pwj\" (UniqueName: \"kubernetes.io/projected/c40906b1-b78c-4e65-9c35-346626adeba3-kube-api-access-45pwj\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411069 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411077 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/429f2d90-393a-4205-9597-4a1d92dd15be-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411084 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c40906b1-b78c-4e65-9c35-346626adeba3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411092 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411099 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411109 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411116 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411152 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f15102ce-82ca-49c8-a069-25469380b043-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.411160 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c40906b1-b78c-4e65-9c35-346626adeba3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.613740 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.635645 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b56868599-9s4h9"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.656258 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.662180 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c54d4859c-6cf2w"] Feb 17 13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.667736 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 
13:46:55 crc kubenswrapper[4804]: I0217 13:46:55.673046 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-hp2db"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.494537 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496117 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="init" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496140 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="init" Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496270 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496745 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: E0217 13:46:56.496808 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.496821 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.497188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15102ce-82ca-49c8-a069-25469380b043" containerName="neutron-db-sync" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.497233 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.498432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.517186 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.585892 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429f2d90-393a-4205-9597-4a1d92dd15be" path="/var/lib/kubelet/pods/429f2d90-393a-4205-9597-4a1d92dd15be/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.586466 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f8b969-63f0-4c36-baa9-e86e4b0bf0d9" path="/var/lib/kubelet/pods/64f8b969-63f0-4c36-baa9-e86e4b0bf0d9/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.587020 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40906b1-b78c-4e65-9c35-346626adeba3" path="/var/lib/kubelet/pods/c40906b1-b78c-4e65-9c35-346626adeba3/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.587536 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" path="/var/lib/kubelet/pods/d29250cb-6c2b-4994-ba6f-f3b7239ec3e2/volumes" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.629167 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.630559 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.632926 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.633911 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.634078 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.635195 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mckmx" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.640630 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646759 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646823 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646855 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646908 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.646937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748224 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748559 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748591 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748622 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748662 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748767 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748793 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748827 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.748842 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.749500 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: 
\"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.750116 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.750833 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.751181 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.751611 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.772099 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"dnsmasq-dns-6b7b667979-k5l98\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 
13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850611 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850684 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850714 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.850805 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854441 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854466 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.854807 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.855086 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.860570 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.869620 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"neutron-547f989fd6-rqkvc\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:56 crc kubenswrapper[4804]: I0217 13:46:56.955161 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.009572 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.009727 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trmx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f9zkj_openstack(02a921c8-6579-451b-beaf-9832cf900668): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.010947 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f9zkj" podUID="02a921c8-6579-451b-beaf-9832cf900668" Feb 17 13:46:57 crc kubenswrapper[4804]: E0217 13:46:57.274800 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-f9zkj" podUID="02a921c8-6579-451b-beaf-9832cf900668" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.389359 4804 scope.go:117] "RemoveContainer" containerID="d82f8e60d688c7c01688fbedc29bdcd643db8c569309612415da050dc9220d5f" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.540932 4804 scope.go:117] "RemoveContainer" containerID="db5a3b86c0d8b3db5d6271f9217c22ad17bdcc258a7e248a0fd7a959c200bb06" Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.620641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.813392 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"] Feb 17 13:46:57 crc kubenswrapper[4804]: W0217 13:46:57.826615 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96609ec5_c9e0_4611_85ff_f7dc474d889a.slice/crio-7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0 WatchSource:0}: Error finding container 7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0: Status 404 returned error can't find the container with id 7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0 Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.942406 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:46:57 crc kubenswrapper[4804]: I0217 13:46:57.987012 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9ffb6f5c6-fczv5"] Feb 17 13:46:57 crc kubenswrapper[4804]: W0217 13:46:57.993791 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85415d6a_8a5f_4b65_b182_2bfe221e8eee.slice/crio-70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3 WatchSource:0}: Error finding container 70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3: Status 404 returned error can't find the container with id 70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3 Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.149631 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:46:58 crc kubenswrapper[4804]: W0217 13:46:58.161819 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ec519a7_9081_4341_ad6c_c81dda70bd3a.slice/crio-ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36 WatchSource:0}: Error finding container ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36: Status 404 returned error can't find the container with id ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36 Feb 17 13:46:58 crc 
kubenswrapper[4804]: I0217 13:46:58.193712 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.275996 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:46:58 crc kubenswrapper[4804]: W0217 13:46:58.283702 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2f2352e_7e9b_439f_be3c_b48b70681658.slice/crio-f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d WatchSource:0}: Error finding container f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d: Status 404 returned error can't find the container with id f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.285834 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerStarted","Data":"604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.285868 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerStarted","Data":"7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.293192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.311110 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.315245 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerStarted","Data":"872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.318509 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7kgzk" podStartSLOduration=24.318489637 podStartE2EDuration="24.318489637s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:58.307628395 +0000 UTC m=+1292.419047732" watchObservedRunningTime="2026-02-17 13:46:58.318489637 +0000 UTC m=+1292.429908984" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.321579 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.327569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"efcc6251e72926fc927c924715b8a426c728584d969b1b7a63b8d304f8f0c323"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.346145 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"b085a946a0d0c5dd1859aecc784b43e603e7ed1f79fe7a947c4f1b01db4b14a2"} Feb 17 13:46:58 crc 
kubenswrapper[4804]: I0217 13:46:58.350296 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xf9m6" podStartSLOduration=5.560752154 podStartE2EDuration="33.350278835s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:27.20337636 +0000 UTC m=+1261.314795697" lastFinishedPulling="2026-02-17 13:46:54.992903041 +0000 UTC m=+1289.104322378" observedRunningTime="2026-02-17 13:46:58.333962483 +0000 UTC m=+1292.445381830" watchObservedRunningTime="2026-02-17 13:46:58.350278835 +0000 UTC m=+1292.461698172" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.354093 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerStarted","Data":"acc1d16ca31ae16b95fd7513bacd065031f5a80799a0b49cb8f97e1864a0396a"} Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.531610 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.533548 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.540532 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.545625 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.545676 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698503 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698543 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698617 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698680 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698711 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.698796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.800995 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801331 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801371 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801426 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801455 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801480 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.801503 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" 
(UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.809870 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.809902 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.810567 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.811338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.811887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.815893 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.819871 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"neutron-77797bd57-r2gff\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:58 crc kubenswrapper[4804]: I0217 13:46:58.988104 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.171318 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-hp2db" podUID="d29250cb-6c2b-4994-ba6f-f3b7239ec3e2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.375085 4804 generic.go:334] "Generic (PLEG): container finished" podID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" exitCode=0 Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.375224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.382452 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" 
event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerStarted","Data":"ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.405946 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.414411 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jz9x9" podStartSLOduration=3.517771708 podStartE2EDuration="34.414377204s" podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:27.142357433 +0000 UTC m=+1261.253776770" lastFinishedPulling="2026-02-17 13:46:58.038962929 +0000 UTC m=+1292.150382266" observedRunningTime="2026-02-17 13:46:59.409667996 +0000 UTC m=+1293.521087333" watchObservedRunningTime="2026-02-17 13:46:59.414377204 +0000 UTC m=+1293.525796541" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.417606 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"3c7dfb1330433a8d2839e256e243dd44b22edc10229d43afe0d5574fd17b96aa"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423674 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423715 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" 
event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.423749 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerStarted","Data":"f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.424849 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.428511 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1"} Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.455443 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-547f989fd6-rqkvc" podStartSLOduration=3.455425044 podStartE2EDuration="3.455425044s" podCreationTimestamp="2026-02-17 13:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:46:59.443578422 +0000 UTC m=+1293.554997759" watchObservedRunningTime="2026-02-17 13:46:59.455425044 +0000 UTC m=+1293.566844381" Feb 17 13:46:59 crc kubenswrapper[4804]: I0217 13:46:59.685852 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:46:59 crc kubenswrapper[4804]: W0217 13:46:59.699270 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd4a1b7_336a_4b57_a341_a413ccd8a223.slice/crio-b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464 
WatchSource:0}: Error finding container b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464: Status 404 returned error can't find the container with id b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464 Feb 17 13:47:00 crc kubenswrapper[4804]: I0217 13:47:00.438126 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.449377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerStarted","Data":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.449989 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.452022 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerStarted","Data":"74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.453650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.455229 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} Feb 17 13:47:01 crc 
kubenswrapper[4804]: I0217 13:47:01.456899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9ffb6f5c6-fczv5" event={"ID":"e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f","Type":"ContainerStarted","Data":"c98e2e3f274ed3b6712ded3d6d3a0315988fce20ffc58176db8f6fc0c39cdb28"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459698 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerStarted","Data":"d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f"} Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459892 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log" containerID="cri-o://96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1" gracePeriod=30 Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.459928 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd" containerID="cri-o://d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f" gracePeriod=30 Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.473936 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" podStartSLOduration=5.473920572 podStartE2EDuration="5.473920572s" podCreationTimestamp="2026-02-17 13:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.468991128 +0000 UTC m=+1295.580410465" watchObservedRunningTime="2026-02-17 13:47:01.473920572 +0000 UTC m=+1295.585339909" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.505422 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.505405872 podStartE2EDuration="14.505405872s" podCreationTimestamp="2026-02-17 13:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.492759155 +0000 UTC m=+1295.604178492" watchObservedRunningTime="2026-02-17 13:47:01.505405872 +0000 UTC m=+1295.616825209" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.532738 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9ffb6f5c6-fczv5" podStartSLOduration=26.975343411 podStartE2EDuration="27.532716601s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="2026-02-17 13:46:57.996167815 +0000 UTC m=+1292.107587152" lastFinishedPulling="2026-02-17 13:46:58.553541015 +0000 UTC m=+1292.664960342" observedRunningTime="2026-02-17 13:47:01.513477936 +0000 UTC m=+1295.624897273" watchObservedRunningTime="2026-02-17 13:47:01.532716601 +0000 UTC m=+1295.644135938" Feb 17 13:47:01 crc kubenswrapper[4804]: I0217 13:47:01.535942 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=27.535924412 podStartE2EDuration="27.535924412s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:01.532088512 +0000 UTC m=+1295.643507859" watchObservedRunningTime="2026-02-17 13:47:01.535924412 +0000 UTC m=+1295.647343749" Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.470559 4804 generic.go:334] "Generic (PLEG): container finished" podID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerID="872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.470638 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerDied","Data":"872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473125 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerID="d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473158 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerID="96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1" exitCode=143 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.473298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1"} Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.475076 4804 generic.go:334] "Generic (PLEG): container finished" podID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerID="604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64" exitCode=0 Feb 17 13:47:02 crc kubenswrapper[4804]: I0217 13:47:02.475224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerDied","Data":"604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64"} Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 
13:47:04.396622 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.396897 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.520999 4804 generic.go:334] "Generic (PLEG): container finished" podID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerID="ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc" exitCode=0 Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.521046 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerDied","Data":"ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc"} Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.741087 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.753639 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.824982 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825019 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825052 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825183 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825282 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825324 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") pod \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\" (UID: \"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.825359 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") pod \"96609ec5-c9e0-4611-85ff-f7dc474d889a\" (UID: \"96609ec5-c9e0-4611-85ff-f7dc474d889a\") " Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.828603 4804 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs" (OuterVolumeSpecName: "logs") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.833293 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts" (OuterVolumeSpecName: "scripts") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843370 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843444 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.843447 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p" (OuterVolumeSpecName: "kube-api-access-rhl8p") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). 
InnerVolumeSpecName "kube-api-access-rhl8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.847843 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts" (OuterVolumeSpecName: "scripts") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.847953 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s" (OuterVolumeSpecName: "kube-api-access-snn6s") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "kube-api-access-snn6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.867313 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data" (OuterVolumeSpecName: "config-data") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.877610 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data" (OuterVolumeSpecName: "config-data") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.878810 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.878892 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.899830 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96609ec5-c9e0-4611-85ff-f7dc474d889a" (UID: "96609ec5-c9e0-4611-85ff-f7dc474d889a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.906547 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" (UID: "14e1fc7b-0e6c-4377-b4e0-74e77e951b0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936471 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936510 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936525 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhl8p\" (UniqueName: \"kubernetes.io/projected/96609ec5-c9e0-4611-85ff-f7dc474d889a-kube-api-access-rhl8p\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936539 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936548 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936560 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snn6s\" (UniqueName: \"kubernetes.io/projected/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-kube-api-access-snn6s\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936570 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936580 4804 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936591 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936603 4804 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:04 crc kubenswrapper[4804]: I0217 13:47:04.936615 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96609ec5-c9e0-4611-85ff-f7dc474d889a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.119257 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.245937 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246009 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246034 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246056 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246075 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246111 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrhpb\" (UniqueName: 
\"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246246 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") pod \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\" (UID: \"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a\") " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.246987 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs" (OuterVolumeSpecName: "logs") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.250598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.250812 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts" (OuterVolumeSpecName: "scripts") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.266492 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb" (OuterVolumeSpecName: "kube-api-access-rrhpb") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "kube-api-access-rrhpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.272037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.295514 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.313752 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.319376 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data" (OuterVolumeSpecName: "config-data") pod "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" (UID: "9f129188-ebe1-45c9-8c55-ff5cd08f2e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348244 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrhpb\" (UniqueName: \"kubernetes.io/projected/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-kube-api-access-rrhpb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348277 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348302 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348316 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 
crc kubenswrapper[4804]: I0217 13:47:05.348325 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348358 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348382 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.348391 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.366588 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.450325 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.533666 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xf9m6" event={"ID":"14e1fc7b-0e6c-4377-b4e0-74e77e951b0d","Type":"ContainerDied","Data":"ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.533710 4804 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ff3a70443dcb1450d10696c24e8f78a38b035a2406676a888b6d8c4f0796ab75" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.534113 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xf9m6" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.535864 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerStarted","Data":"c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.544910 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerStarted","Data":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.545064 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f129188-ebe1-45c9-8c55-ff5cd08f2e8a","Type":"ContainerDied","Data":"b085a946a0d0c5dd1859aecc784b43e603e7ed1f79fe7a947c4f1b01db4b14a2"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549081 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.549121 4804 scope.go:117] "RemoveContainer" containerID="d76559a73775110d3ee8468599e5e90ccd7d9508465099de17a00f9e4f56fb8f" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550749 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7kgzk" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550761 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7kgzk" event={"ID":"96609ec5-c9e0-4611-85ff-f7dc474d889a","Type":"ContainerDied","Data":"7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.550848 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d54b7b327f8ebe87ac5109269b506e672be06f69b6bfb3774f552c367900af0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.563267 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58989b55cb-zjfvf" podStartSLOduration=30.614830103 podStartE2EDuration="31.563244305s" podCreationTimestamp="2026-02-17 13:46:34 +0000 UTC" firstStartedPulling="2026-02-17 13:46:57.995752222 +0000 UTC m=+1292.107171559" lastFinishedPulling="2026-02-17 13:46:58.944166424 +0000 UTC m=+1293.055585761" observedRunningTime="2026-02-17 13:47:05.56085359 +0000 UTC m=+1299.672272927" watchObservedRunningTime="2026-02-17 13:47:05.563244305 +0000 UTC m=+1299.674663662" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.563659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"} Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.583872 4804 scope.go:117] "RemoveContainer" containerID="96d29bd83b497a761b451864a140d1abcb104cfcfced732b3dc36a76cf94eca1" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.604971 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77797bd57-r2gff" podStartSLOduration=7.604950856 podStartE2EDuration="7.604950856s" podCreationTimestamp="2026-02-17 13:46:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:05.591148962 +0000 UTC m=+1299.702568299" watchObservedRunningTime="2026-02-17 13:47:05.604950856 +0000 UTC m=+1299.716370193" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.634380 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.694452 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.732819 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733227 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733241 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap" Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733255 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733260 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd" Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 13:47:05.733278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733284 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log" Feb 17 13:47:05 crc kubenswrapper[4804]: E0217 
13:47:05.733300 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733306 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733476 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-log" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733489 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" containerName="glance-httpd" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733499 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" containerName="keystone-bootstrap" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.733510 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" containerName="placement-db-sync" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.734389 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.737161 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.737252 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.742625 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.868068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9cc757857-wng6k"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.869753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.880040 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881050 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881249 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881305 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881462 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.881522 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2fq28" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884486 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884553 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884579 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884596 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884645 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 
13:47:05.884671 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884690 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.884710 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.919305 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9cc757857-wng6k"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.985089 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.986964 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987051 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987102 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987229 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987254 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987457 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987546 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf547\" (UniqueName: \"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987592 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987623 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.987959 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:47:05 crc kubenswrapper[4804]: I0217 13:47:05.988753 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.992976 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.993917 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994093 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994359 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djck\" (UniqueName: 
\"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.994418 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.997934 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:05.998319 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.005424 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.012949 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.022552 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.040366 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.040648 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.043794 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.044008 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.044183 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.045524 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.047936 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dr6jm" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.061836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"glance-default-external-api-0\" (UID: 
\"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.073547 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.089131 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100355 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100421 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf547\" (UniqueName: 
\"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100562 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100659 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.100759 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.111853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-credential-keys\") pod 
\"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.112647 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-scripts\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.113243 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-public-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.121728 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-internal-tls-certs\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.125510 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-config-data\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.127745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-combined-ca-bundle\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc 
kubenswrapper[4804]: I0217 13:47:06.129758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-fernet-keys\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.136986 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf547\" (UniqueName: \"kubernetes.io/projected/30df70d3-9323-4ddd-9d1c-2dae72cff6d9-kube-api-access-xf547\") pod \"keystone-9cc757857-wng6k\" (UID: \"30df70d3-9323-4ddd-9d1c-2dae72cff6d9\") " pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.200652 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207727 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.207921 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208051 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208232 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.208335 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.251979 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.255991 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d69649784-lnwhw"] Feb 17 13:47:06 crc kubenswrapper[4804]: E0217 13:47:06.256481 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.256499 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.256773 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" containerName="barbican-db-sync" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.257941 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.275453 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d69649784-lnwhw"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.310516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312823 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.312978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.313302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.314431 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.315434 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.316012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.318887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.319041 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.320788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.320998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.329605 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"placement-67b49bc6f6-6kg64\" (UID: 
\"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.334883 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"placement-67b49bc6f6-6kg64\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.420993 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421083 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421115 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") pod \"19dd0c13-b898-4147-ae5f-cbc5d4915910\" (UID: \"19dd0c13-b898-4147-ae5f-cbc5d4915910\") " Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421344 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl6h\" (UniqueName: \"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 
crc kubenswrapper[4804]: I0217 13:47:06.421434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421520 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421560 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421590 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.421633 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc 
kubenswrapper[4804]: I0217 13:47:06.421668 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.434655 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c" (OuterVolumeSpecName: "kube-api-access-v577c") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "kube-api-access-v577c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.434786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.471749 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19dd0c13-b898-4147-ae5f-cbc5d4915910" (UID: "19dd0c13-b898-4147-ae5f-cbc5d4915910"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.523862 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524151 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524178 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524214 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524244 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 
13:47:06.524274 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl6h\" (UniqueName: \"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524316 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524385 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524397 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v577c\" (UniqueName: \"kubernetes.io/projected/19dd0c13-b898-4147-ae5f-cbc5d4915910-kube-api-access-v577c\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524407 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/19dd0c13-b898-4147-ae5f-cbc5d4915910-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.524920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d67cb-268b-4724-bba9-a7ab9a10ed6c-logs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.528495 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-scripts\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-public-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-internal-tls-certs\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.529727 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-config-data\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.532180 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d67cb-268b-4724-bba9-a7ab9a10ed6c-combined-ca-bundle\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.541485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl6h\" (UniqueName: 
\"kubernetes.io/projected/858d67cb-268b-4724-bba9-a7ab9a10ed6c-kube-api-access-kkl6h\") pod \"placement-6d69649784-lnwhw\" (UID: \"858d67cb-268b-4724-bba9-a7ab9a10ed6c\") " pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.549157 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.600166 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f129188-ebe1-45c9-8c55-ff5cd08f2e8a" path="/var/lib/kubelet/pods/9f129188-ebe1-45c9-8c55-ff5cd08f2e8a/volumes" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.614963 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jz9x9" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.615026 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jz9x9" event={"ID":"19dd0c13-b898-4147-ae5f-cbc5d4915910","Type":"ContainerDied","Data":"3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874"} Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.615051 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2919ea140840d1d3c9f9481431391d06220a9f53b8b9586d077d9974ea9874" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.617563 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.720979 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.762508 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.767422 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.768986 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.781356 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6zhqd" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.781700 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.782857 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.822450 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9cc757857-wng6k"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831333 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831375 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62hk\" (UniqueName: 
\"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.831442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.863598 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.885849 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.918930 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.922458 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.932982 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"] Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933577 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h62hk\" (UniqueName: \"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933616 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " 
pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.933636 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.934351 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-logs\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.942836 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-combined-ca-bundle\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.943603 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.945617 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-config-data-custom\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 
13:47:06 crc kubenswrapper[4804]: I0217 13:47:06.963863 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62hk\" (UniqueName: \"kubernetes.io/projected/c7f4e4c3-9ec8-4923-bf7b-4058899e863f-kube-api-access-h62hk\") pod \"barbican-worker-5f97f9545f-tngcj\" (UID: \"c7f4e4c3-9ec8-4923-bf7b-4058899e863f\") " pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.033900 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035418 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035458 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.035609 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.065809 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.067286 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.085552 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.094300 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.095905 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.099286 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.122452 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f97f9545f-tngcj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.136049 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139241 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139410 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139465 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139543 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139583 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139745 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 
13:47:07.139773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.139798 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.141328 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/297a0648-3cbd-4f1e-8bc4-d918a702c33b-logs\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.148808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.173600 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-combined-ca-bundle\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc 
kubenswrapper[4804]: I0217 13:47:07.176922 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29cv\" (UniqueName: \"kubernetes.io/projected/297a0648-3cbd-4f1e-8bc4-d918a702c33b-kube-api-access-w29cv\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.178529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/297a0648-3cbd-4f1e-8bc4-d918a702c33b-config-data-custom\") pod \"barbican-keystone-listener-f46489f4-x24zj\" (UID: \"297a0648-3cbd-4f1e-8bc4-d918a702c33b\") " pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241681 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241808 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc 
kubenswrapper[4804]: I0217 13:47:07.241837 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241912 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.241962 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 
crc kubenswrapper[4804]: I0217 13:47:07.241984 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242018 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.242870 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.243924 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 
13:47:07.243995 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.246184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.247057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.270651 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"dnsmasq-dns-848cf88cfc-cxg2s\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.343945 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344287 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344335 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.344510 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.345311 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.350851 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.350913 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.351155 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.359933 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.360561 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"barbican-api-6955855558-kv2ld\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.382780 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.496711 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.592473 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.592518 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.685840 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"eefbbd3f0b520bf32a3f3135f04b4227a82b1cef683b398cd1cff8682da24dc5"} Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.692139 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" containerID="cri-o://49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" gracePeriod=10 Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693230 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9cc757857-wng6k" event={"ID":"30df70d3-9323-4ddd-9d1c-2dae72cff6d9","Type":"ContainerStarted","Data":"4066f7167e65b72192b5a8b8761ef6253bac87e59c26264981e57eed910d7b00"} Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693272 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9cc757857-wng6k" event={"ID":"30df70d3-9323-4ddd-9d1c-2dae72cff6d9","Type":"ContainerStarted","Data":"818e2c29c503b423be035ac1c8c503b52ac8e692015ea1a5cf2a62ba7d1eb249"} Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.693290 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.700308 4804 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.700941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.730903 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9cc757857-wng6k" podStartSLOduration=2.730881261 podStartE2EDuration="2.730881261s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:07.716795718 +0000 UTC m=+1301.828215055" watchObservedRunningTime="2026-02-17 13:47:07.730881261 +0000 UTC m=+1301.842300598" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.761964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f97f9545f-tngcj"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.790585 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.954228 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f46489f4-x24zj"] Feb 17 13:47:07 crc kubenswrapper[4804]: I0217 13:47:07.972143 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d69649784-lnwhw"] Feb 17 13:47:08 crc kubenswrapper[4804]: W0217 13:47:08.013643 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858d67cb_268b_4724_bba9_a7ab9a10ed6c.slice/crio-aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7 WatchSource:0}: Error finding container aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7: Status 404 returned error can't find the container 
with id aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7 Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.025727 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.288366 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.510729 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622848 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622892 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622949 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.622980 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: 
\"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.623066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.623120 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") pod \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\" (UID: \"df6e7376-a420-4a04-abf8-ab5bc3f76d7c\") " Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.684047 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf" (OuterVolumeSpecName: "kube-api-access-gr4tf") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "kube-api-access-gr4tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.743382 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4tf\" (UniqueName: \"kubernetes.io/projected/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-kube-api-access-gr4tf\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.817850 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config" (OuterVolumeSpecName: "config") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.818109 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.818356 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.822047 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.824732 4804 generic.go:334] "Generic (PLEG): container finished" podID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" exitCode=0 Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.824774 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845698 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845953 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845963 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.845971 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.861655 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df6e7376-a420-4a04-abf8-ab5bc3f76d7c" (UID: "df6e7376-a420-4a04-abf8-ab5bc3f76d7c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:08 crc kubenswrapper[4804]: I0217 13:47:08.950094 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6e7376-a420-4a04-abf8-ab5bc3f76d7c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036888 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036952 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036966 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.036985 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"6d4b3efff11097530d0823774a5b359f486e2f4a1ee72bdad83ea74ea46124cd"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"d142c1aa2575d7b9e84287db646c2533a45b3dabc061a435a7b2335843cef639"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037030 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"aa5075c46c2059bce2617b72ff9095be9a4925386b6156393a536861fa43ead7"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037038 4804 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"cb47a75c1cfc217b32c8ce0b08563ac965de4aa9dba350829c6314a0811ff20a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037047 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"ba0cc2230bff6e65b06b28d38b4ed605a390c7cff5692c7c90bbd33cf0934ef3"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037066 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037101 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5l98" event={"ID":"df6e7376-a420-4a04-abf8-ab5bc3f76d7c","Type":"ContainerDied","Data":"acc1d16ca31ae16b95fd7513bacd065031f5a80799a0b49cb8f97e1864a0396a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037111 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerStarted","Data":"1ceb04ce2633cdf168f7ec2c7223a7b5436da513a4112c0b4cec53ae79c55d6e"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.037130 4804 scope.go:117] "RemoveContainer" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 
13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.132755 4804 scope.go:117] "RemoveContainer" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.192815 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.213648 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5l98"] Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.219715 4804 scope.go:117] "RemoveContainer" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.221490 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": container with ID starting with 49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968 not found: ID does not exist" containerID="49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.221546 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968"} err="failed to get container status \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": rpc error: code = NotFound desc = could not find container \"49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968\": container with ID starting with 49f322e2e5a998b03b8c1e6d24a870a2761f6b73101b84a5dad2e245b08bc968 not found: ID does not exist" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.221569 4804 scope.go:117] "RemoveContainer" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.222325 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": container with ID starting with d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47 not found: ID does not exist" containerID="d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.222377 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47"} err="failed to get container status \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": rpc error: code = NotFound desc = could not find container \"d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47\": container with ID starting with d94c4e75462da55817e78fa331ff2dfa0d9f3a19457e74a4ec47a997d6a72b47 not found: ID does not exist" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.846791 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.847145 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.847157 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerStarted","Data":"0345398b72b1e224ae72d70254cce0d2edce23c11bae1237519d2ed6af3adbe1"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.851460 4804 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.851495 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858032 4804 generic.go:334] "Generic (PLEG): container finished" podID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerID="61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347" exitCode=0 Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerStarted","Data":"d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.858380 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.859029 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.869135 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerStarted","Data":"b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.875023 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6955855558-kv2ld" podStartSLOduration=3.874999379 podStartE2EDuration="3.874999379s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.867991619 +0000 UTC m=+1303.979410946" watchObservedRunningTime="2026-02-17 13:47:09.874999379 +0000 UTC m=+1303.986418716" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.883665 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d69649784-lnwhw" event={"ID":"858d67cb-268b-4724-bba9-a7ab9a10ed6c","Type":"ContainerStarted","Data":"1718bb4d39cef70818fc83641992541a17ca3fafedb93bd6235c83058144fd75"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.884567 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914153 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerStarted","Data":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914243 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.914319 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.916565 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.919940 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podStartSLOduration=3.919922781 podStartE2EDuration="3.919922781s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.890736014 +0000 UTC 
m=+1304.002155361" watchObservedRunningTime="2026-02-17 13:47:09.919922781 +0000 UTC m=+1304.031342118" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.930692 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d69649784-lnwhw" podStartSLOduration=3.930672989 podStartE2EDuration="3.930672989s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.930028729 +0000 UTC m=+1304.041448066" watchObservedRunningTime="2026-02-17 13:47:09.930672989 +0000 UTC m=+1304.042092326" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.957772 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.9577567 podStartE2EDuration="4.9577567s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.949506411 +0000 UTC m=+1304.060925748" watchObservedRunningTime="2026-02-17 13:47:09.9577567 +0000 UTC m=+1304.069176027" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991244 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.991669 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991680 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: E0217 13:47:09.991695 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="init" Feb 17 13:47:09 
crc kubenswrapper[4804]: I0217 13:47:09.991701 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="init" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.991899 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" containerName="dnsmasq-dns" Feb 17 13:47:09 crc kubenswrapper[4804]: I0217 13:47:09.992842 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.007817 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.008070 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.008671 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67b49bc6f6-6kg64" podStartSLOduration=5.00865277 podStartE2EDuration="5.00865277s" podCreationTimestamp="2026-02-17 13:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:09.980667301 +0000 UTC m=+1304.092086638" watchObservedRunningTime="2026-02-17 13:47:10.00865277 +0000 UTC m=+1304.120072107" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.049945 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.073994 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.074048 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076449 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076594 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.076904 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod 
\"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.077009 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182120 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182405 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182513 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182599 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: 
\"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.182825 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.184893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b89da32-9537-4c7b-a266-0d38ac52b069-logs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.199762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-combined-ca-bundle\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.199804 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-public-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.203455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-internal-tls-certs\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.203586 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data-custom\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.205329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b89da32-9537-4c7b-a266-0d38ac52b069-config-data\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.224953 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6vr\" (UniqueName: \"kubernetes.io/projected/2b89da32-9537-4c7b-a266-0d38ac52b069-kube-api-access-8h6vr\") pod \"barbican-api-7cc7c97fdd-bhd7w\" (UID: \"2b89da32-9537-4c7b-a266-0d38ac52b069\") " pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc 
kubenswrapper[4804]: I0217 13:47:10.334873 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.605422 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6e7376-a420-4a04-abf8-ab5bc3f76d7c" path="/var/lib/kubelet/pods/df6e7376-a420-4a04-abf8-ab5bc3f76d7c/volumes" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.757864 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.759741 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.863710 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc7c97fdd-bhd7w"] Feb 17 13:47:10 crc kubenswrapper[4804]: I0217 13:47:10.940751 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:11 crc kubenswrapper[4804]: I0217 13:47:11.951139 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"c8ca657f6236874f1b16afcede029cb285c5198e2af52a99411a433e516ce462"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.963610 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"3328fd468939de33640af98a7f6863a2bddfa10ef51851ae30229e4ba4d549bf"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.963976 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" 
event={"ID":"297a0648-3cbd-4f1e-8bc4-d918a702c33b","Type":"ContainerStarted","Data":"fd1c4f6788f3648bec6c2023e257e90bce62947fdd08eff0aa5554c1655bb985"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.966298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"69d1f41d1a7f6153e6465f83519516ab918b4bcb906ffb3bc452ca225cddd7da"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.968055 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"bbb93f0a32531c05e1658bd58d304296c70a556976494d7b1c76d834a1ac7d52"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.970352 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerStarted","Data":"2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2"} Feb 17 13:47:12 crc kubenswrapper[4804]: I0217 13:47:12.987669 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f46489f4-x24zj" podStartSLOduration=3.300717027 podStartE2EDuration="6.98765027s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="2026-02-17 13:47:07.959842659 +0000 UTC m=+1302.071261996" lastFinishedPulling="2026-02-17 13:47:11.646775902 +0000 UTC m=+1305.758195239" observedRunningTime="2026-02-17 13:47:12.97969222 +0000 UTC m=+1307.091111557" watchObservedRunningTime="2026-02-17 13:47:12.98765027 +0000 UTC m=+1307.099069607" Feb 17 13:47:13 crc kubenswrapper[4804]: I0217 13:47:13.000114 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f9zkj" podStartSLOduration=2.790507695 podStartE2EDuration="48.000090331s" 
podCreationTimestamp="2026-02-17 13:46:25 +0000 UTC" firstStartedPulling="2026-02-17 13:46:26.457246526 +0000 UTC m=+1260.568665863" lastFinishedPulling="2026-02-17 13:47:11.666829162 +0000 UTC m=+1305.778248499" observedRunningTime="2026-02-17 13:47:12.998054067 +0000 UTC m=+1307.109473404" watchObservedRunningTime="2026-02-17 13:47:13.000090331 +0000 UTC m=+1307.111509678" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.870507 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.872031 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:47:14 crc kubenswrapper[4804]: I0217 13:47:14.882677 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9ffb6f5c6-fczv5" podUID="e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.090024 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.090821 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.116543 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:47:16 crc kubenswrapper[4804]: I0217 13:47:16.127878 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.020872 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-5f97f9545f-tngcj" event={"ID":"c7f4e4c3-9ec8-4923-bf7b-4058899e863f","Type":"ContainerStarted","Data":"53928c720a0f168a47cb2e57fd13c48b24857ae2a6615ea52cf04473b07d7cd1"} Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.024959 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" event={"ID":"2b89da32-9537-4c7b-a266-0d38ac52b069","Type":"ContainerStarted","Data":"1e0154e2eb8f5cf5b5fb7a8b17bfce3481ede2d328565aaa84ed6a947e22e95b"} Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.024995 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025008 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025016 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.025026 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.042776 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f97f9545f-tngcj" podStartSLOduration=7.236516192 podStartE2EDuration="11.042759515s" podCreationTimestamp="2026-02-17 13:47:06 +0000 UTC" firstStartedPulling="2026-02-17 13:47:07.842246242 +0000 UTC m=+1301.953665579" lastFinishedPulling="2026-02-17 13:47:11.648489555 +0000 UTC m=+1305.759908902" observedRunningTime="2026-02-17 13:47:17.034178496 +0000 UTC m=+1311.145597833" watchObservedRunningTime="2026-02-17 13:47:17.042759515 +0000 UTC m=+1311.154178852" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.082243 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-7cc7c97fdd-bhd7w" podStartSLOduration=8.082221876 podStartE2EDuration="8.082221876s" podCreationTimestamp="2026-02-17 13:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:17.060102931 +0000 UTC m=+1311.171522278" watchObservedRunningTime="2026-02-17 13:47:17.082221876 +0000 UTC m=+1311.193641203" Feb 17 13:47:17 crc kubenswrapper[4804]: E0217 13:47:17.193787 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.384374 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.502740 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:17 crc kubenswrapper[4804]: I0217 13:47:17.502990 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" containerID="cri-o://b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" gracePeriod=10 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.054173 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerStarted","Data":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.054880 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" containerID="cri-o://c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055049 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055173 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" containerID="cri-o://2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.055245 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" containerID="cri-o://c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" gracePeriod=30 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.089462 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fa3f342-a062-421d-8c06-f53468a8db00" containerID="b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" exitCode=0 Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.089649 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41"} Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.163238 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249802 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249883 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249908 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.249956 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.250045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") pod \"1fa3f342-a062-421d-8c06-f53468a8db00\" (UID: \"1fa3f342-a062-421d-8c06-f53468a8db00\") " Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.256348 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg" (OuterVolumeSpecName: "kube-api-access-9bfbg") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "kube-api-access-9bfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.313952 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.329686 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.329716 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.338136 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353322 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353431 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353448 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353460 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bfbg\" (UniqueName: \"kubernetes.io/projected/1fa3f342-a062-421d-8c06-f53468a8db00-kube-api-access-9bfbg\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.353474 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.360661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config" (OuterVolumeSpecName: "config") pod "1fa3f342-a062-421d-8c06-f53468a8db00" (UID: "1fa3f342-a062-421d-8c06-f53468a8db00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.454502 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa3f342-a062-421d-8c06-f53468a8db00-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:18 crc kubenswrapper[4804]: I0217 13:47:18.790717 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098599 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" event={"ID":"1fa3f342-a062-421d-8c06-f53468a8db00","Type":"ContainerDied","Data":"cdb7f4453bccc68342ae31db3bbfc987aaa5d47b283d10e4a4bd0daebe7bbf50"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098635 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mtwfj" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.098653 4804 scope.go:117] "RemoveContainer" containerID="b9a3f395e90e39b7c24df35dd6e3f0dd7e4bcbc43cd3d4f5483755287749ca41" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.104973 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" exitCode=0 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105008 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" exitCode=2 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.105095 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109824 4804 generic.go:334] "Generic (PLEG): container finished" podID="02a921c8-6579-451b-beaf-9832cf900668" containerID="2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2" exitCode=0 Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109902 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.109913 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.110110 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerDied","Data":"2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2"} Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.138423 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.148014 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mtwfj"] Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.272497 4804 scope.go:117] "RemoveContainer" containerID="63be9f06e01e3909b7ff94ea9b177c0a528139e2942719322a381a426d4f2574" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.527879 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.614757 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.745503 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:47:19 crc kubenswrapper[4804]: I0217 13:47:19.752818 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.501276 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.584329 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" path="/var/lib/kubelet/pods/1fa3f342-a062-421d-8c06-f53468a8db00/volumes" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.698768 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699145 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699178 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699241 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699392 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.699492 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") pod \"02a921c8-6579-451b-beaf-9832cf900668\" (UID: \"02a921c8-6579-451b-beaf-9832cf900668\") " Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.700192 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02a921c8-6579-451b-beaf-9832cf900668-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.708384 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.712552 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2" (OuterVolumeSpecName: "kube-api-access-trmx2") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "kube-api-access-trmx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.716300 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts" (OuterVolumeSpecName: "scripts") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.762021 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.762445 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data" (OuterVolumeSpecName: "config-data") pod "02a921c8-6579-451b-beaf-9832cf900668" (UID: "02a921c8-6579-451b-beaf-9832cf900668"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801786 4804 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801819 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801830 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trmx2\" (UniqueName: \"kubernetes.io/projected/02a921c8-6579-451b-beaf-9832cf900668-kube-api-access-trmx2\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801841 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:20 crc kubenswrapper[4804]: I0217 13:47:20.801853 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a921c8-6579-451b-beaf-9832cf900668-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130820 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f9zkj" event={"ID":"02a921c8-6579-451b-beaf-9832cf900668","Type":"ContainerDied","Data":"ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c"} Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130847 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f9zkj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.130864 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef91adb98631667de4f24978b6722ac67d6e2cf414b43820776723b677addd2c" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.443496 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447013 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="init" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447041 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="init" Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447081 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447090 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 13:47:21 crc kubenswrapper[4804]: E0217 13:47:21.447102 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447108 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447327 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa3f342-a062-421d-8c06-f53468a8db00" containerName="dnsmasq-dns" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.447348 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a921c8-6579-451b-beaf-9832cf900668" containerName="cinder-db-sync" Feb 17 
13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.448260 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453049 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453117 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453398 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r5hqb" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.453511 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.457377 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.518249 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.519946 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.548133 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623708 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623736 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623826 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623866 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623892 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623915 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623939 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.623959 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.624019 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.624087 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.725956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726035 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726064 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: 
\"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726101 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726150 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726231 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726257 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726309 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726366 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726422 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.726852 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc 
kubenswrapper[4804]: I0217 13:47:21.727410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.727472 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.727857 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.729010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.729965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.733997 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.735615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.735984 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.739719 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.751482 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"cinder-scheduler-0\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.758814 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"dnsmasq-dns-6578955fd5-mtrxj\" (UID: 
\"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.758872 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.760347 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.762963 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.774807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.777525 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.864738 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930160 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930219 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930251 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930266 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930317 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930335 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:21 crc kubenswrapper[4804]: I0217 13:47:21.930392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032065 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032420 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032505 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: 
\"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032761 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.032879 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.039758 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.043283 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.043830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.051104 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.055445 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.103480 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"cinder-api-0\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.215965 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.233830 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.625171 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:47:22 crc kubenswrapper[4804]: I0217 13:47:22.799115 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.176721 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189138 4804 generic.go:334] "Generic (PLEG): container finished" podID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165" exitCode=0 Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189231 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.189264 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerStarted","Data":"2d996d992d2a3254b879bd96b12e636e65525644b7181f7f3f61897c257c69b0"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.194979 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"3fb4a87b299d4238d785f3f50f31c965eb4578228964e6ab3bbb8b0fd289c1da"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.207074 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215546 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" exitCode=0 Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215609 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215641 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5ccd477-88cd-4284-9de7-f336def1c7a1","Type":"ContainerDied","Data":"9ab7cd127419d840e73931cd84e8a62cca6dbdb1c678768ac1433e7970f3f9a0"} Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215664 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.215704 4804 scope.go:117] "RemoveContainer" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.247580 4804 scope.go:117] "RemoveContainer" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264869 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264924 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.264949 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265007 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265112 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265132 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.265395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") pod \"e5ccd477-88cd-4284-9de7-f336def1c7a1\" (UID: \"e5ccd477-88cd-4284-9de7-f336def1c7a1\") " Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.268538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.268832 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.279489 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd" (OuterVolumeSpecName: "kube-api-access-8jjjd") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "kube-api-access-8jjjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.282334 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts" (OuterVolumeSpecName: "scripts") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.307388 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.316662 4804 scope.go:117] "RemoveContainer" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.361349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369580 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369613 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369623 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369633 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369642 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jjjd\" (UniqueName: \"kubernetes.io/projected/e5ccd477-88cd-4284-9de7-f336def1c7a1-kube-api-access-8jjjd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.369651 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5ccd477-88cd-4284-9de7-f336def1c7a1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.375834 4804 scope.go:117] "RemoveContainer" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.376542 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": container with ID starting with 2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5 not found: ID does not exist" containerID="2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.376598 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5"} err="failed to get container status \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": rpc error: code = NotFound desc = could not find container \"2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5\": container with ID starting with 2177d96b7ccc69abb21c72197299cdd7db75d5cbd8d109344470068cf476eeb5 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.376629 4804 scope.go:117] "RemoveContainer" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.377137 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": container with ID starting with c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5 not found: ID does not exist" containerID="c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.377216 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5"} err="failed to get container status \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": rpc error: code = NotFound desc = could not find container \"c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5\": 
container with ID starting with c40d28d17d3f2c54c12b4cb0de4103cece798c21dc1c74faef6b897bc022c5c5 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.377236 4804 scope.go:117] "RemoveContainer" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.382634 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": container with ID starting with c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67 not found: ID does not exist" containerID="c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.382674 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67"} err="failed to get container status \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": rpc error: code = NotFound desc = could not find container \"c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67\": container with ID starting with c4388959eec5a5e0e8de102bea2b09564ba60fc2953cdfd791f11ae067628b67 not found: ID does not exist" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.384802 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data" (OuterVolumeSpecName: "config-data") pod "e5ccd477-88cd-4284-9de7-f336def1c7a1" (UID: "e5ccd477-88cd-4284-9de7-f336def1c7a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.471791 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5ccd477-88cd-4284-9de7-f336def1c7a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.579713 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.594668 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.622845 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623662 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.623758 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623854 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.623924 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: E0217 13:47:23.623999 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624083 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 
13:47:23.624392 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="ceilometer-notification-agent" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="sg-core" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.624575 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" containerName="proxy-httpd" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.626727 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.634507 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.635110 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.641375 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.732402 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798285 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798331 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: 
\"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798365 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798685 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798755 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.798809 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: 
\"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900493 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900769 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900866 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.900989 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901081 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901166 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.901250 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.902016 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.902169 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.908412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.911499 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.919332 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.922903 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.924111 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"ceilometer-0\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " pod="openstack/ceilometer-0" Feb 17 13:47:23 crc kubenswrapper[4804]: I0217 13:47:23.983032 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.236530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.238329 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.292342 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerStarted","Data":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"} Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.292558 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.316763 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" podStartSLOduration=3.316745933 podStartE2EDuration="3.316745933s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:24.314544544 +0000 UTC m=+1318.425963881" watchObservedRunningTime="2026-02-17 13:47:24.316745933 +0000 UTC m=+1318.428165270" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.564689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.611751 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e5ccd477-88cd-4284-9de7-f336def1c7a1" path="/var/lib/kubelet/pods/e5ccd477-88cd-4284-9de7-f336def1c7a1/volumes" Feb 17 13:47:24 crc kubenswrapper[4804]: I0217 13:47:24.873629 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.302298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.302910 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"4b8e0eba24a3942ba5514c36c6f889a4738be910f9bdb4e385c1915be330d79c"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.304882 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerStarted","Data":"723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306790 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerStarted","Data":"761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6"} Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306954 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" 
containerID="cri-o://75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" gracePeriod=30 Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.306970 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" containerID="cri-o://761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" gracePeriod=30 Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.332312 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.564690298 podStartE2EDuration="4.332295785s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="2026-02-17 13:47:22.21823624 +0000 UTC m=+1316.329655577" lastFinishedPulling="2026-02-17 13:47:22.985841727 +0000 UTC m=+1317.097261064" observedRunningTime="2026-02-17 13:47:25.324550802 +0000 UTC m=+1319.435970139" watchObservedRunningTime="2026-02-17 13:47:25.332295785 +0000 UTC m=+1319.443715122" Feb 17 13:47:25 crc kubenswrapper[4804]: I0217 13:47:25.356050 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.356028932 podStartE2EDuration="4.356028932s" podCreationTimestamp="2026-02-17 13:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:25.351406416 +0000 UTC m=+1319.462825753" watchObservedRunningTime="2026-02-17 13:47:25.356028932 +0000 UTC m=+1319.467448269" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318417 4804 generic.go:334] "Generic (PLEG): container finished" podID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerID="761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" exitCode=0 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318466 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerID="75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" exitCode=143 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318504 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6"} Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.318566 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6"} Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.778444 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.822026 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc7c97fdd-bhd7w" Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.874857 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.875078 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" containerID="cri-o://3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a" gracePeriod=30 Feb 17 13:47:26 crc kubenswrapper[4804]: I0217 13:47:26.875457 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" containerID="cri-o://7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06" gracePeriod=30 Feb 17 13:47:26 crc 
kubenswrapper[4804]: I0217 13:47:26.967232 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.198430 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.316749 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.316982 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" containerID="cri-o://c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" gracePeriod=30 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.317586 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" containerID="cri-o://b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" gracePeriod=30 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.323595 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357175 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c576cfd85-655nj"] Feb 17 13:47:27 crc kubenswrapper[4804]: E0217 13:47:27.357652 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357672 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: E0217 13:47:27.357707 4804 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357715 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357929 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api-log" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.357948 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" containerName="cinder-api" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.359028 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.359775 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.367054 4804 generic.go:334] "Generic (PLEG): container finished" podID="410af4be-4a66-404d-9809-f58444bc6473" containerID="3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a" exitCode=143 Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.367100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378506 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378745 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0586d6d2-92ba-4c34-9153-3de3fe22add2","Type":"ContainerDied","Data":"3fb4a87b299d4238d785f3f50f31c965eb4578228964e6ab3bbb8b0fd289c1da"} Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.378785 4804 scope.go:117] "RemoveContainer" containerID="761183841cb4a6313b54fea97dc2892653f9c6938944cc406dcf275dfc1eb3c6" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381865 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.381914 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382024 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382131 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382240 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") pod \"0586d6d2-92ba-4c34-9153-3de3fe22add2\" (UID: \"0586d6d2-92ba-4c34-9153-3de3fe22add2\") " Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.382538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.383332 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0586d6d2-92ba-4c34-9153-3de3fe22add2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.383962 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs" (OuterVolumeSpecName: "logs") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.384038 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c576cfd85-655nj"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.391102 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.391132 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts" (OuterVolumeSpecName: "scripts") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.392439 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd" (OuterVolumeSpecName: "kube-api-access-wfrmd") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "kube-api-access-wfrmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.428223 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": read tcp 10.217.0.2:54846->10.217.0.155:9696: read: connection reset by peer" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.439835 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.468356 4804 scope.go:117] "RemoveContainer" containerID="75dd018c8e9cd8c6677bcd27c77a5f1fcc5d2bc1ba3b553480e69ca53b36f5e6" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.483978 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data" (OuterVolumeSpecName: "config-data") pod "0586d6d2-92ba-4c34-9153-3de3fe22add2" (UID: "0586d6d2-92ba-4c34-9153-3de3fe22add2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484775 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484817 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484893 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484958 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " 
pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.484998 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485143 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485155 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485163 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485171 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0586d6d2-92ba-4c34-9153-3de3fe22add2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485180 4804 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-wfrmd\" (UniqueName: \"kubernetes.io/projected/0586d6d2-92ba-4c34-9153-3de3fe22add2-kube-api-access-wfrmd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.485190 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0586d6d2-92ba-4c34-9153-3de3fe22add2-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591263 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591369 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591403 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: 
I0217 13:47:27.591429 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591460 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.591500 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.596781 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-combined-ca-bundle\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597096 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-httpd-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597217 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-public-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.597977 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-config\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.601416 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-ovndb-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.602137 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-internal-tls-certs\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.610968 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nnts\" (UniqueName: \"kubernetes.io/projected/fb86b3d7-c6a3-43d5-a8da-805aa7d73a66-kube-api-access-2nnts\") pod \"neutron-c576cfd85-655nj\" (UID: \"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66\") " pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.684006 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.724368 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.741300 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.807797 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.810994 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816288 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816605 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.816708 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.819610 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907741 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907778 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907815 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907888 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907935 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.907961 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc 
kubenswrapper[4804]: I0217 13:47:27.908010 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:27 crc kubenswrapper[4804]: I0217 13:47:27.908188 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010384 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010422 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010451 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010487 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010518 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010606 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.010457 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-etc-machine-id\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 
13:47:28.010675 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.013998 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-logs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.017702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-scripts\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.017868 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-public-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.019793 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.021788 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " 
pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.030782 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-config-data-custom\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.031865 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.037789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhsk\" (UniqueName: \"kubernetes.io/projected/271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92-kube-api-access-mmhsk\") pod \"cinder-api-0\" (UID: \"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92\") " pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.136550 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.338179 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c576cfd85-655nj"] Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.408550 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" exitCode=0 Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.408742 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"} Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.411733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"80395f7cf08a3f295690844faf186f0d36b5ab94d7a70807624bfa83ff416d77"} Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.590777 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0586d6d2-92ba-4c34-9153-3de3fe22add2" path="/var/lib/kubelet/pods/0586d6d2-92ba-4c34-9153-3de3fe22add2/volumes" Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.701537 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 13:47:28 crc kubenswrapper[4804]: I0217 13:47:28.984587 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77797bd57-r2gff" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.454375 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"7efbb62ff61c35ea16cabda7c84e1279ca8d2af07fb7491fd2094190ad1846f1"} Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.489487 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca"} Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.508459 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"32bc21f5abd2fb63b5c3e9a028dd07b9e583eb42bdf92ccbe33b8b4f924c450d"} Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.508687 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c576cfd85-655nj" event={"ID":"fb86b3d7-c6a3-43d5-a8da-805aa7d73a66","Type":"ContainerStarted","Data":"6cdee8a2a44746e99685b57fd02fda0803227405c0ea737ecba590dd9cb4f9d0"} Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.510222 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.541885 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c576cfd85-655nj" podStartSLOduration=2.541864628 podStartE2EDuration="2.541864628s" podCreationTimestamp="2026-02-17 13:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:29.535667122 +0000 UTC m=+1323.647086459" watchObservedRunningTime="2026-02-17 13:47:29.541864628 +0000 UTC m=+1323.653283965" Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.602076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9ffb6f5c6-fczv5" Feb 17 
13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.668215 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.668883 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" containerID="cri-o://d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3" gracePeriod=30 Feb 17 13:47:29 crc kubenswrapper[4804]: I0217 13:47:29.669321 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58989b55cb-zjfvf" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" containerID="cri-o://c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983" gracePeriod=30 Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.107980 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54514->10.217.0.163:9311: read: connection reset by peer" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.108034 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6955855558-kv2ld" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:54504->10.217.0.163:9311: read: connection reset by peer" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.519275 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"c7468c9ea7959857ebbdf390fc902dc07aa8ed275c14ba41579ead40e9970dbe"} Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 
13:47:30.521510 4804 generic.go:334] "Generic (PLEG): container finished" podID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerID="c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983" exitCode=0 Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.521570 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983"} Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.525610 4804 generic.go:334] "Generic (PLEG): container finished" podID="410af4be-4a66-404d-9809-f58444bc6473" containerID="7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06" exitCode=0 Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.526538 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06"} Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.728814 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879459 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.879657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880001 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs" (OuterVolumeSpecName: "logs") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.880613 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") pod \"410af4be-4a66-404d-9809-f58444bc6473\" (UID: \"410af4be-4a66-404d-9809-f58444bc6473\") " Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.881212 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/410af4be-4a66-404d-9809-f58444bc6473-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.907949 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.911516 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729" (OuterVolumeSpecName: "kube-api-access-ks729") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "kube-api-access-ks729". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.932569 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.958870 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data" (OuterVolumeSpecName: "config-data") pod "410af4be-4a66-404d-9809-f58444bc6473" (UID: "410af4be-4a66-404d-9809-f58444bc6473"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983333 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks729\" (UniqueName: \"kubernetes.io/projected/410af4be-4a66-404d-9809-f58444bc6473-kube-api-access-ks729\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983386 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983397 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:30 crc kubenswrapper[4804]: I0217 13:47:30.983406 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410af4be-4a66-404d-9809-f58444bc6473-config-data\") on node 
\"crc\" DevicePath \"\"" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6955855558-kv2ld" event={"ID":"410af4be-4a66-404d-9809-f58444bc6473","Type":"ContainerDied","Data":"0345398b72b1e224ae72d70254cce0d2edce23c11bae1237519d2ed6af3adbe1"} Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540773 4804 scope.go:117] "RemoveContainer" containerID="7797ca6ae3158f51032a378dceeadfa4a4aab48a558972be10499d12d6917e06" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.540413 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6955855558-kv2ld" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.543683 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92","Type":"ContainerStarted","Data":"67d0f359ca6c010971e9a845ebb55aaabb794027eca2c4ed6c73d3a10f7dc586"} Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.543902 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.550999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerStarted","Data":"35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a"} Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.551058 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.579306 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.579286631 podStartE2EDuration="4.579286631s" podCreationTimestamp="2026-02-17 13:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:31.574296784 +0000 UTC m=+1325.685716151" watchObservedRunningTime="2026-02-17 13:47:31.579286631 +0000 UTC m=+1325.690705978" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.603447 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.228039787 podStartE2EDuration="8.603427889s" podCreationTimestamp="2026-02-17 13:47:23 +0000 UTC" firstStartedPulling="2026-02-17 13:47:24.605360165 +0000 UTC m=+1318.716779502" lastFinishedPulling="2026-02-17 13:47:30.980748267 +0000 UTC m=+1325.092167604" observedRunningTime="2026-02-17 13:47:31.601995055 +0000 UTC m=+1325.713414412" watchObservedRunningTime="2026-02-17 13:47:31.603427889 +0000 UTC m=+1325.714847226" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.621399 4804 scope.go:117] "RemoveContainer" containerID="3225329e6ab93eac20c2d9227e1f3df46c5bdbdd2affe08906fba20733bf989a" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.630174 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.637698 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6955855558-kv2ld"] Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.867438 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.984632 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:31 crc kubenswrapper[4804]: I0217 13:47:31.985140 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" containerID="cri-o://d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" 
gracePeriod=10 Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.324211 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.384647 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: connect: connection refused" Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.408174 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.561634 4804 generic.go:334] "Generic (PLEG): container finished" podID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerID="d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" exitCode=0 Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.562365 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31"} Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.562891 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" containerID="cri-o://2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9" gracePeriod=30 Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.563086 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe" containerID="cri-o://723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99" gracePeriod=30 Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.591625 4804 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410af4be-4a66-404d-9809-f58444bc6473" path="/var/lib/kubelet/pods/410af4be-4a66-404d-9809-f58444bc6473/volumes" Feb 17 13:47:32 crc kubenswrapper[4804]: I0217 13:47:32.995121 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.129451 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130288 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130793 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.130902 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.131066 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.131216 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") pod \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\" (UID: \"b85d5058-b075-42ca-8d69-a86cfc1bd01c\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.139618 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs" (OuterVolumeSpecName: "kube-api-access-4fgvs") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "kube-api-access-4fgvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.180561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.207004 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.210806 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config" (OuterVolumeSpecName: "config") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234052 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgvs\" (UniqueName: \"kubernetes.io/projected/b85d5058-b075-42ca-8d69-a86cfc1bd01c-kube-api-access-4fgvs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234087 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234097 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234105 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.234693 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.252831 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b85d5058-b075-42ca-8d69-a86cfc1bd01c" (UID: "b85d5058-b075-42ca-8d69-a86cfc1bd01c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.335699 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.335732 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b85d5058-b075-42ca-8d69-a86cfc1bd01c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.552717 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.572830 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.572892 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-cxg2s" event={"ID":"b85d5058-b075-42ca-8d69-a86cfc1bd01c","Type":"ContainerDied","Data":"1ceb04ce2633cdf168f7ec2c7223a7b5436da513a4112c0b4cec53ae79c55d6e"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.573727 4804 scope.go:117] "RemoveContainer" containerID="d980c32d83966a44bf55958cd6329cf2bd80a3344dee8dab3f8264dc4b275f31" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.580963 4804 generic.go:334] "Generic (PLEG): container finished" podID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerID="723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99" exitCode=0 Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.581274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587520 4804 generic.go:334] "Generic (PLEG): container finished" podID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" exitCode=0 Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587573 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587594 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77797bd57-r2gff" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.587608 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77797bd57-r2gff" event={"ID":"3dd4a1b7-336a-4b57-a341-a413ccd8a223","Type":"ContainerDied","Data":"b27dcb323e9a77c57f04bfd3aad2ceaaa35b5cea105117b952a32b3cda64f464"} Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.604644 4804 scope.go:117] "RemoveContainer" containerID="61ec7f44479bb9daa4c9c91948a35a994fd304cc644764be5a4bbb119a672347" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.635283 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.640349 4804 scope.go:117] "RemoveContainer" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.647670 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-cxg2s"] Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.659211 4804 scope.go:117] "RemoveContainer" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685173 4804 scope.go:117] "RemoveContainer" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: E0217 13:47:33.685653 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": container with ID starting with b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c not found: ID does not exist" containerID="b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685697 4804 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c"} err="failed to get container status \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": rpc error: code = NotFound desc = could not find container \"b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c\": container with ID starting with b1a36e61c5f168281044a5910102bb23fe1a6fd32062c6f2f8cfa71c080e8e7c not found: ID does not exist" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.685719 4804 scope.go:117] "RemoveContainer" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: E0217 13:47:33.686258 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": container with ID starting with c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef not found: ID does not exist" containerID="c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.686283 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef"} err="failed to get container status \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": rpc error: code = NotFound desc = could not find container \"c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef\": container with ID starting with c0c8c975211dcbac6c6d2312a029db9539af9a8ff4aee589c0e35e3c01c70cef not found: ID does not exist" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742163 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") pod 
\"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742276 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742304 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742331 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742358 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742494 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.742567 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") pod \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\" (UID: \"3dd4a1b7-336a-4b57-a341-a413ccd8a223\") " Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.757173 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c" (OuterVolumeSpecName: "kube-api-access-6642c") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "kube-api-access-6642c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.760072 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.786497 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config" (OuterVolumeSpecName: "config") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.792113 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.806733 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.807307 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.819792 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3dd4a1b7-336a-4b57-a341-a413ccd8a223" (UID: "3dd4a1b7-336a-4b57-a341-a413ccd8a223"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846374 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846568 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846680 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846753 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846812 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6642c\" (UniqueName: \"kubernetes.io/projected/3dd4a1b7-336a-4b57-a341-a413ccd8a223-kube-api-access-6642c\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846867 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:33 crc kubenswrapper[4804]: I0217 13:47:33.846996 4804 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd4a1b7-336a-4b57-a341-a413ccd8a223-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.006667 4804 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.016416 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77797bd57-r2gff"] Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.589402 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" path="/var/lib/kubelet/pods/3dd4a1b7-336a-4b57-a341-a413ccd8a223/volumes" Feb 17 13:47:34 crc kubenswrapper[4804]: I0217 13:47:34.590471 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" path="/var/lib/kubelet/pods/b85d5058-b075-42ca-8d69-a86cfc1bd01c/volumes" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.629704 4804 generic.go:334] "Generic (PLEG): container finished" podID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerID="2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9" exitCode=0 Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.629808 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9"} Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.630303 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6","Type":"ContainerDied","Data":"2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504"} Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.630320 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2618a56a4b1417c4a63c4fff93e5f2af5e701449d4dbe686563ceaa84785f504" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.635990 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.799772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.799950 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800042 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800081 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800117 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.800179 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4p6n\" (UniqueName: 
\"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") pod \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\" (UID: \"32ab16b4-e683-4bbe-97b3-6a2c1915d9f6\") " Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.801192 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.807725 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts" (OuterVolumeSpecName: "scripts") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.810511 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n" (OuterVolumeSpecName: "kube-api-access-t4p6n") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "kube-api-access-t4p6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.830421 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.883326 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905454 4804 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905494 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905502 4804 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905511 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.905519 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4p6n\" (UniqueName: \"kubernetes.io/projected/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-kube-api-access-t4p6n\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:36 crc kubenswrapper[4804]: I0217 13:47:36.919318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data" (OuterVolumeSpecName: "config-data") pod "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" (UID: "32ab16b4-e683-4bbe-97b3-6a2c1915d9f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.007481 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.637785 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.669450 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.675865 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.694563 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695017 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695042 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695066 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695075 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" Feb 17 13:47:37 crc 
kubenswrapper[4804]: E0217 13:47:37.695093 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695101 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695116 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695125 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695137 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695144 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695160 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="init" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695168 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="init" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695179 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe" Feb 17 13:47:37 crc kubenswrapper[4804]: E0217 13:47:37.695220 4804 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695230 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695440 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-httpd" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695456 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api-log" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695471 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="cinder-scheduler" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695482 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd4a1b7-336a-4b57-a341-a413ccd8a223" containerName="neutron-api" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695498 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="410af4be-4a66-404d-9809-f58444bc6473" containerName="barbican-api" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695505 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" containerName="probe" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.695517 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85d5058-b075-42ca-8d69-a86cfc1bd01c" containerName="dnsmasq-dns" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.696847 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.701491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.710047 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.822924 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823037 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823095 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823155 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 
13:47:37.823190 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.823306 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925494 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925588 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925646 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925748 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.925915 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7170af0-a08f-4b96-b93a-5353d633a82f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.931420 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.932735 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-config-data\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " 
pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.938626 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.940983 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7170af0-a08f-4b96-b93a-5353d633a82f-scripts\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:37 crc kubenswrapper[4804]: I0217 13:47:37.945242 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbsm\" (UniqueName: \"kubernetes.io/projected/f7170af0-a08f-4b96-b93a-5353d633a82f-kube-api-access-rlbsm\") pod \"cinder-scheduler-0\" (UID: \"f7170af0-a08f-4b96-b93a-5353d633a82f\") " pod="openstack/cinder-scheduler-0" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.021700 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.084906 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-9cc757857-wng6k" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.282801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.290833 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d69649784-lnwhw" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.317563 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.398604 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.415422 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.594177 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ab16b4-e683-4bbe-97b3-6a2c1915d9f6" path="/var/lib/kubelet/pods/32ab16b4-e683-4bbe-97b3-6a2c1915d9f6/volumes" Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.616689 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 13:47:38 crc kubenswrapper[4804]: W0217 13:47:38.628510 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7170af0_a08f_4b96_b93a_5353d633a82f.slice/crio-a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4 WatchSource:0}: Error finding container a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4: Status 404 returned error can't find the container with id 
a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4 Feb 17 13:47:38 crc kubenswrapper[4804]: I0217 13:47:38.645845 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"a47273205caf6fe4b177704a64df6be1f6f9979e3f90120c9ee4ef23be7b97b4"} Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.658487 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b49bc6f6-6kg64" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" containerID="cri-o://e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" gracePeriod=30 Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.659192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"11c79fb4cb1dc78a4695196d01c8b73f4a5dea4519d6b0c91d65561adadfed48"} Feb 17 13:47:39 crc kubenswrapper[4804]: I0217 13:47:39.659495 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67b49bc6f6-6kg64" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" containerID="cri-o://a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" gracePeriod=30 Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.446679 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.669310 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f7170af0-a08f-4b96-b93a-5353d633a82f","Type":"ContainerStarted","Data":"df5f6549084c31c50fa3a697556bed7273dfc81dfa61851a2013f5aa0e70ef80"} Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.671705 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="8c441055-8615-497e-8754-d107b3be24c7" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" exitCode=143 Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.671741 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"} Feb 17 13:47:40 crc kubenswrapper[4804]: I0217 13:47:40.693640 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.693621127 podStartE2EDuration="3.693621127s" podCreationTimestamp="2026-02-17 13:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:40.689600301 +0000 UTC m=+1334.801019668" watchObservedRunningTime="2026-02-17 13:47:40.693621127 +0000 UTC m=+1334.805040464" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.000746 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.006511 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.008811 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-29ss8" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.013452 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.016279 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.019218 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092450 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092699 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.092763 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194412 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194463 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.194485 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.195395 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.201992 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.203609 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.211808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"openstackclient\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.225041 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.226248 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.236025 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.292400 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.293755 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.301904 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:41 crc kubenswrapper[4804]: E0217 13:47:41.345523 4804 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 13:47:41 crc kubenswrapper[4804]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e1213875-d9b5-42f3-ab21-54ea5f12ea7c_0(b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652" Netns:"/var/run/netns/616f34e0-73e5-4b7e-a2e0-c50d135f5e9d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652;K8S_POD_UID=e1213875-d9b5-42f3-ab21-54ea5f12ea7c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1213875-d9b5-42f3-ab21-54ea5f12ea7c]: expected pod UID "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" but got "de1a53e3-68ce-4ecd-9c0a-80ffce568891" from Kube API Feb 17 13:47:41 crc kubenswrapper[4804]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 13:47:41 crc kubenswrapper[4804]: > Feb 17 13:47:41 crc kubenswrapper[4804]: E0217 13:47:41.345610 4804 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 13:47:41 crc kubenswrapper[4804]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e1213875-d9b5-42f3-ab21-54ea5f12ea7c_0(b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652" Netns:"/var/run/netns/616f34e0-73e5-4b7e-a2e0-c50d135f5e9d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b1bf7e425f2cc4d4e5d4b7d977228afe4aeb114fcc55d710b476387a836b3652;K8S_POD_UID=e1213875-d9b5-42f3-ab21-54ea5f12ea7c" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1213875-d9b5-42f3-ab21-54ea5f12ea7c]: expected pod UID "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" but got "de1a53e3-68ce-4ecd-9c0a-80ffce568891" from Kube API Feb 17 13:47:41 crc kubenswrapper[4804]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 13:47:41 crc kubenswrapper[4804]: > pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397751 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397807 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.397918 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.398043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " 
pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499927 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.499960 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.500030 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.506122 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.507223 4804 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.510699 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de1a53e3-68ce-4ecd-9c0a-80ffce568891-openstack-config-secret\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.525764 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfjjx\" (UniqueName: \"kubernetes.io/projected/de1a53e3-68ce-4ecd-9c0a-80ffce568891-kube-api-access-hfjjx\") pod \"openstackclient\" (UID: \"de1a53e3-68ce-4ecd-9c0a-80ffce568891\") " pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.683764 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.687590 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" podUID="de1a53e3-68ce-4ecd-9c0a-80ffce568891" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.694334 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.695129 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.812612 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814497 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.814677 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") pod \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\" (UID: \"e1213875-d9b5-42f3-ab21-54ea5f12ea7c\") " Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.815105 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.815467 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.818602 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn" (OuterVolumeSpecName: "kube-api-access-vd4fn") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "kube-api-access-vd4fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.819424 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.823340 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1213875-d9b5-42f3-ab21-54ea5f12ea7c" (UID: "e1213875-d9b5-42f3-ab21-54ea5f12ea7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917318 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4fn\" (UniqueName: \"kubernetes.io/projected/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-kube-api-access-vd4fn\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917351 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:41 crc kubenswrapper[4804]: I0217 13:47:41.917359 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1213875-d9b5-42f3-ab21-54ea5f12ea7c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.193273 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 13:47:42 crc kubenswrapper[4804]: W0217 13:47:42.194315 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1a53e3_68ce_4ecd_9c0a_80ffce568891.slice/crio-6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f WatchSource:0}: Error finding container 6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f: Status 404 returned error can't find the container with id 6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.584320 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" path="/var/lib/kubelet/pods/e1213875-d9b5-42f3-ab21-54ea5f12ea7c/volumes" Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.705341 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.705333 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de1a53e3-68ce-4ecd-9c0a-80ffce568891","Type":"ContainerStarted","Data":"6d7072130547b3c16c17592010f193215c21a3cb4e8a04f7361f9ea5ef5a247f"} Feb 17 13:47:42 crc kubenswrapper[4804]: I0217 13:47:42.713854 4804 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1213875-d9b5-42f3-ab21-54ea5f12ea7c" podUID="de1a53e3-68ce-4ecd-9c0a-80ffce568891" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.022712 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.260092 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343101 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343264 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: 
\"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343414 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343439 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343509 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.343581 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") pod \"8c441055-8615-497e-8754-d107b3be24c7\" (UID: \"8c441055-8615-497e-8754-d107b3be24c7\") " Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.344710 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs" (OuterVolumeSpecName: "logs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.350872 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9" (OuterVolumeSpecName: "kube-api-access-zsgq9") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "kube-api-access-zsgq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.352031 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts" (OuterVolumeSpecName: "scripts") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.428505 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446417 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446455 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446468 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c441055-8615-497e-8754-d107b3be24c7-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.446478 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsgq9\" (UniqueName: \"kubernetes.io/projected/8c441055-8615-497e-8754-d107b3be24c7-kube-api-access-zsgq9\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.461604 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data" (OuterVolumeSpecName: "config-data") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.472301 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.481351 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c441055-8615-497e-8754-d107b3be24c7" (UID: "8c441055-8615-497e-8754-d107b3be24c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548459 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548488 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.548496 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c441055-8615-497e-8754-d107b3be24c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718274 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c441055-8615-497e-8754-d107b3be24c7" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" exitCode=0 Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718337 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67b49bc6f6-6kg64" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718329 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718385 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67b49bc6f6-6kg64" event={"ID":"8c441055-8615-497e-8754-d107b3be24c7","Type":"ContainerDied","Data":"ba0cc2230bff6e65b06b28d38b4ed605a390c7cff5692c7c90bbd33cf0934ef3"} Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.718405 4804 scope.go:117] "RemoveContainer" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.742782 4804 scope.go:117] "RemoveContainer" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.763612 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.770916 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-67b49bc6f6-6kg64"] Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775158 4804 scope.go:117] "RemoveContainer" containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: E0217 13:47:43.775628 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": container with ID starting with a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c not found: ID does not exist" 
containerID="a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775662 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c"} err="failed to get container status \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": rpc error: code = NotFound desc = could not find container \"a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c\": container with ID starting with a7e8ab1ab077b94304088259748bb06ecfc3c7584c12f7f6a0f542683df40b3c not found: ID does not exist" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775684 4804 scope.go:117] "RemoveContainer" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: E0217 13:47:43.775918 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": container with ID starting with e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec not found: ID does not exist" containerID="e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec" Feb 17 13:47:43 crc kubenswrapper[4804]: I0217 13:47:43.775942 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec"} err="failed to get container status \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": rpc error: code = NotFound desc = could not find container \"e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec\": container with ID starting with e068127ef4a35fe1666fe6daae6e3863f5f73c668d7a73942cfafc2c05e0acec not found: ID does not exist" Feb 17 13:47:44 crc kubenswrapper[4804]: I0217 13:47:44.584927 4804 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c441055-8615-497e-8754-d107b3be24c7" path="/var/lib/kubelet/pods/8c441055-8615-497e-8754-d107b3be24c7/volumes" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.553821 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:45 crc kubenswrapper[4804]: E0217 13:47:45.570925 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.570967 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: E0217 13:47:45.570983 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.570990 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.571192 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-api" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.571226 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c441055-8615-497e-8754-d107b3be24c7" containerName="placement-log" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.572185 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.575764 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.576891 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.577008 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.580297 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691391 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691714 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc 
kubenswrapper[4804]: I0217 13:47:45.691769 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691797 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.691900 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.692362 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.692434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" 
Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794169 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794302 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794361 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794383 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794402 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.794437 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.796615 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-run-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.799329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be0372d3-4646-46e7-af04-6977a7426f35-log-httpd\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.801270 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-public-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.803360 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-internal-tls-certs\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.803505 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-combined-ca-bundle\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.808528 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0372d3-4646-46e7-af04-6977a7426f35-config-data\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.817112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfbb\" (UniqueName: \"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-kube-api-access-hlfbb\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.817853 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/be0372d3-4646-46e7-af04-6977a7426f35-etc-swift\") pod \"swift-proxy-59cfdfc65f-48l6n\" (UID: \"be0372d3-4646-46e7-af04-6977a7426f35\") " pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.869753 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870027 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" containerID="cri-o://1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870048 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" containerID="cri-o://35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870137 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" containerID="cri-o://092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.870096 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" containerID="cri-o://99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" gracePeriod=30 Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.889945 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" probeResult="failure" 
output="Get \"http://10.217.0.168:3000/\": EOF" Feb 17 13:47:45 crc kubenswrapper[4804]: I0217 13:47:45.893730 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.752871 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" exitCode=0 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753158 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" exitCode=2 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753170 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" exitCode=0 Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.752964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a"} Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753235 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca"} Feb 17 13:47:46 crc kubenswrapper[4804]: I0217 13:47:46.753253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141"} Feb 17 13:47:48 crc kubenswrapper[4804]: I0217 13:47:48.344360 4804 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.222676 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.224454 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" containerID="cri-o://953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" gracePeriod=30 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.224522 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" containerID="cri-o://b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" gracePeriod=30 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.799053 4804 generic.go:334] "Generic (PLEG): container finished" podID="da1535f5-a225-489d-af6d-cbfa6042d239" containerID="092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" exitCode=0 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.799110 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b"} Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.801740 4804 generic.go:334] "Generic (PLEG): container finished" podID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerID="953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" exitCode=143 Feb 17 13:47:50 crc kubenswrapper[4804]: I0217 13:47:50.801783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9"} Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.246480 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.247022 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" containerID="cri-o://3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" gracePeriod=30 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.247502 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" containerID="cri-o://74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" gracePeriod=30 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.813263 4804 generic.go:334] "Generic (PLEG): container finished" podID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerID="3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" exitCode=143 Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.813348 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f"} Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.861626 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-59cfdfc65f-48l6n"] Feb 17 13:47:51 crc kubenswrapper[4804]: W0217 13:47:51.905708 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe0372d3_4646_46e7_af04_6977a7426f35.slice/crio-01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e WatchSource:0}: Error finding container 01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e: Status 404 returned error can't find the container with id 01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e Feb 17 13:47:51 crc kubenswrapper[4804]: I0217 13:47:51.942003 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.010737 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.012805 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013058 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013474 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc 
kubenswrapper[4804]: I0217 13:47:52.014308 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014617 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") pod \"da1535f5-a225-489d-af6d-cbfa6042d239\" (UID: \"da1535f5-a225-489d-af6d-cbfa6042d239\") " Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013632 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.013737 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts" (OuterVolumeSpecName: "scripts") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.014130 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.016298 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8" (OuterVolumeSpecName: "kube-api-access-ljpm8") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "kube-api-access-ljpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.070430 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117430 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117458 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117470 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpm8\" (UniqueName: \"kubernetes.io/projected/da1535f5-a225-489d-af6d-cbfa6042d239-kube-api-access-ljpm8\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117482 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.117492 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da1535f5-a225-489d-af6d-cbfa6042d239-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.158663 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.168505 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data" (OuterVolumeSpecName: "config-data") pod "da1535f5-a225-489d-af6d-cbfa6042d239" (UID: "da1535f5-a225-489d-af6d-cbfa6042d239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.219689 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.219730 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1535f5-a225-489d-af6d-cbfa6042d239-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.821631 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de1a53e3-68ce-4ecd-9c0a-80ffce568891","Type":"ContainerStarted","Data":"84f3b28dc2b3056e3fb5313ca3e48c8ca136793dc253000eb641f73f1de2f9a8"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825099 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da1535f5-a225-489d-af6d-cbfa6042d239","Type":"ContainerDied","Data":"4b8e0eba24a3942ba5514c36c6f889a4738be910f9bdb4e385c1915be330d79c"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825146 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.825216 4804 scope.go:117] "RemoveContainer" containerID="35ecd78bd9969e0d81d74d255802d84e8782104abf917998c0f5c9f7a1d3435a" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827241 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"9cbe66341df3b9590a351bd7c02ac11b961aa3da92b98d61d1d34240c4563e86"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"c8c0f66b6cbbe67c8aab35aeb07fc30545c1b5f65ed15816fa40b4ca13f37252"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-59cfdfc65f-48l6n" event={"ID":"be0372d3-4646-46e7-af04-6977a7426f35","Type":"ContainerStarted","Data":"01f02406beff5b8f1dc14a51f7818ca7fe7b3803580328438da33f9f2c184b0e"} Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827789 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.827882 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.845545 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.055751727 podStartE2EDuration="11.845521974s" podCreationTimestamp="2026-02-17 13:47:41 +0000 UTC" firstStartedPulling="2026-02-17 13:47:42.196969092 +0000 UTC m=+1336.308388439" lastFinishedPulling="2026-02-17 13:47:51.986739349 +0000 UTC m=+1346.098158686" 
observedRunningTime="2026-02-17 13:47:52.840608729 +0000 UTC m=+1346.952028066" watchObservedRunningTime="2026-02-17 13:47:52.845521974 +0000 UTC m=+1346.956941321" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.855174 4804 scope.go:117] "RemoveContainer" containerID="99fe5ef4d5a27697bd3d835ca4e7242c9ea11c1b7e9ff93b4ae3d3d3447f90ca" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.872898 4804 scope.go:117] "RemoveContainer" containerID="092874b0b3e14392b931fcb3901d9071706161752bbb5877b56ec700010be97b" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.883167 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-59cfdfc65f-48l6n" podStartSLOduration=7.883151647 podStartE2EDuration="7.883151647s" podCreationTimestamp="2026-02-17 13:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:52.875661602 +0000 UTC m=+1346.987080939" watchObservedRunningTime="2026-02-17 13:47:52.883151647 +0000 UTC m=+1346.994570984" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.904136 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.905446 4804 scope.go:117] "RemoveContainer" containerID="1aabbda01fd4eb2f80fc3bbf09b7a922e4856861abbe2a6d98b91344740bf141" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.920603 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.940748 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941270 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941305 4804 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941323 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941332 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941354 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941361 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: E0217 13:47:52.941397 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941403 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941600 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-central-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941615 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="ceilometer-notification-agent" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.941624 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="sg-core" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 
13:47:52.941645 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" containerName="proxy-httpd" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.947960 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.949313 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.956936 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:47:52 crc kubenswrapper[4804]: I0217 13:47:52.957544 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034075 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034368 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034440 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034489 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034510 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.034572 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136051 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136129 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136186 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136235 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.136299 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: 
I0217 13:47:53.137721 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.141695 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.141985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.142945 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.143340 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.149703 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " 
pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.164187 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ceilometer-0\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.277382 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.567004 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.570066 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.590892 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.670255 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.674574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.674700 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: 
\"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.675096 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.681548 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.704869 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.732510 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.734022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.752856 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.777833 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778130 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778266 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.778446 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.779702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.803067 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.804546 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.819469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.820064 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"nova-api-db-create-582lj\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.868267 4804 generic.go:334] "Generic (PLEG): container finished" podID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerID="b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" exitCode=0 Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.868359 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be"} Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879616 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " 
pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879656 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.879807 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.884700 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.886583 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.888182 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.889554 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.889848 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.914929 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"nova-api-570c-account-create-update-48hmw\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.936663 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:53 crc kubenswrapper[4804]: I0217 13:47:53.970026 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.988284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.988371 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.991993 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992160 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992234 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.992304 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:53.993789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.007659 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.008825 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"nova-cell0-db-create-6h6dp\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.067207 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.068171 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.069332 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.083629 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.090578 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.095929 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096066 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096245 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.096288 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.097082 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.097597 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.117542 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"nova-cell0-2eb5-account-create-update-xv5m7\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.118925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"nova-cell1-db-create-nn6tq\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.166077 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.194736 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.197889 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.198030 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.282297 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.299496 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300037 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300224 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300262 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300328 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300348 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300393 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300473 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") pod \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\" (UID: \"185b3c31-7ccc-4f8d-bcb1-20cabbf50943\") " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300729 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.300915 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.301042 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.301206 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.302834 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.304670 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs" (OuterVolumeSpecName: "logs") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.307263 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts" (OuterVolumeSpecName: "scripts") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.315879 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.319997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck" (OuterVolumeSpecName: "kube-api-access-9djck") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "kube-api-access-9djck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.324191 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"nova-cell1-6388-account-create-update-skdjv\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.363152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.400670 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403183 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9djck\" (UniqueName: \"kubernetes.io/projected/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-kube-api-access-9djck\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403256 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403269 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403278 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403303 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.403313 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.419760 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.420946 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data" (OuterVolumeSpecName: "config-data") pod "185b3c31-7ccc-4f8d-bcb1-20cabbf50943" (UID: "185b3c31-7ccc-4f8d-bcb1-20cabbf50943"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.438382 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.507247 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185b3c31-7ccc-4f8d-bcb1-20cabbf50943-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.507498 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.599387 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1535f5-a225-489d-af6d-cbfa6042d239" path="/var/lib/kubelet/pods/da1535f5-a225-489d-af6d-cbfa6042d239/volumes" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.600436 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.669783 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.764447 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:47:54 crc 
kubenswrapper[4804]: I0217 13:47:54.779377 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:47:54 crc kubenswrapper[4804]: W0217 13:47:54.790376 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb316de_cd6e_4f79_9387_81f7a8add771.slice/crio-199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03 WatchSource:0}: Error finding container 199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03: Status 404 returned error can't find the container with id 199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03 Feb 17 13:47:54 crc kubenswrapper[4804]: W0217 13:47:54.790628 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa81aac_8f7a_4947_9fbe_c38851b3652e.slice/crio-b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6 WatchSource:0}: Error finding container b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6: Status 404 returned error can't find the container with id b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6 Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.794668 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.931162 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerStarted","Data":"76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.931221 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" 
event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerStarted","Data":"a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.946004 4804 generic.go:334] "Generic (PLEG): container finished" podID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerID="74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" exitCode=0 Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.946089 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.951943 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"185b3c31-7ccc-4f8d-bcb1-20cabbf50943","Type":"ContainerDied","Data":"eefbbd3f0b520bf32a3f3135f04b4227a82b1cef683b398cd1cff8682da24dc5"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.951996 4804 scope.go:117] "RemoveContainer" containerID="b16be4415372b664ded381a846b14c2c2406261a1bf459398d286a236434a0be" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.952004 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.954803 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerStarted","Data":"b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.960500 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerStarted","Data":"7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.974778 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"b3be3c859965ba94fbed2ed95fd8afc727f019eeb34bd5903d88d5dbf4d77e51"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.982486 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.982801 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerStarted","Data":"199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03"} Feb 17 13:47:54 crc kubenswrapper[4804]: I0217 13:47:54.993929 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.003553 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014289 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 
17 13:47:55 crc kubenswrapper[4804]: E0217 13:47:55.014819 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014840 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: E0217 13:47:55.014874 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.014880 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.015053 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-httpd" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.015074 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" containerName="glance-log" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.016147 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.019888 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.021543 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.027558 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.048828 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.071940 4804 scope.go:117] "RemoveContainer" containerID="953b973c887ab4f0021ac7303d1d54237d7d43957bc03fe8e6222019de4450e9" Feb 17 13:47:55 crc kubenswrapper[4804]: W0217 13:47:55.109393 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92d9081e_1e94_4244_b66a_34b05bc98f2d.slice/crio-0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd WatchSource:0}: Error finding container 0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd: Status 404 returned error can't find the container with id 0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137569 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137642 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137689 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137707 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137738 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137765 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 
13:47:55.137779 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.137824 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.238991 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239080 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239111 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239139 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239254 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239297 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.239665 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.248856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.249084 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc2e7136-825b-4608-a106-944f359c7369-logs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.254957 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.254988 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.259820 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.260581 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2e7136-825b-4608-a106-944f359c7369-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.264262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qkq\" (UniqueName: \"kubernetes.io/projected/cc2e7136-825b-4608-a106-944f359c7369-kube-api-access-79qkq\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.273962 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cc2e7136-825b-4608-a106-944f359c7369\") " pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.392753 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.565474 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656787 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656885 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656922 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.656976 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657063 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657091 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657130 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.657162 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") pod \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\" (UID: \"8ec519a7-9081-4341-ad6c-c81dda70bd3a\") " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.659148 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.660565 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs" (OuterVolumeSpecName: "logs") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.688851 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.691533 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts" (OuterVolumeSpecName: "scripts") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.700559 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2" (OuterVolumeSpecName: "kube-api-access-scvw2") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "kube-api-access-scvw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.756466 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data" (OuterVolumeSpecName: "config-data") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.758631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759423 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759450 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scvw2\" (UniqueName: \"kubernetes.io/projected/8ec519a7-9081-4341-ad6c-c81dda70bd3a-kube-api-access-scvw2\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759461 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759470 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759478 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759490 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8ec519a7-9081-4341-ad6c-c81dda70bd3a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.759508 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.818612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ec519a7-9081-4341-ad6c-c81dda70bd3a" (UID: "8ec519a7-9081-4341-ad6c-c81dda70bd3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.822067 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.861746 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.861793 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ec519a7-9081-4341-ad6c-c81dda70bd3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ec519a7-9081-4341-ad6c-c81dda70bd3a","Type":"ContainerDied","Data":"ba1329d4b79ba312c3f527f0d612e3cd76cb2acbaa7d0c300741c813abd79d36"} Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994243 4804 scope.go:117] "RemoveContainer" 
containerID="74e9c41b66fa02c3d94931a9817572fd799183a1707f607e072d3c3dddd9e96b" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.994375 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:55 crc kubenswrapper[4804]: I0217 13:47:55.999734 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerStarted","Data":"0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.000873 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerStarted","Data":"c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.005161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerStarted","Data":"feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de"} Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.038939 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-582lj" podStartSLOduration=3.038917464 podStartE2EDuration="3.038917464s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:56.024313285 +0000 UTC m=+1350.135732632" watchObservedRunningTime="2026-02-17 13:47:56.038917464 +0000 UTC m=+1350.150336801" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.056538 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 
13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.078294 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.094561 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: E0217 13:47:56.094997 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095022 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: E0217 13:47:56.095073 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095082 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095317 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-log" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.095342 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" containerName="glance-httpd" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.096392 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.101703 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.101912 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.102923 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173716 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2rcg\" (UniqueName: \"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173852 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173913 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.173982 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.174032 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.174077 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275713 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275851 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.275976 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276026 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2rcg\" (UniqueName: 
\"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276101 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276140 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276272 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.276530 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-logs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.277110 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52f268a5-3c72-4655-bb36-823c34e5312d-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.280754 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281078 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281537 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.281569 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f268a5-3c72-4655-bb36-823c34e5312d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.295463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2rcg\" (UniqueName: \"kubernetes.io/projected/52f268a5-3c72-4655-bb36-823c34e5312d-kube-api-access-b2rcg\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.311328 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"52f268a5-3c72-4655-bb36-823c34e5312d\") " pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.415316 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.482429 4804 scope.go:117] "RemoveContainer" containerID="3c023d82e32da3e66e3f80b40ff960f9faffbfd6b13149e23d95974790def49f" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.639446 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185b3c31-7ccc-4f8d-bcb1-20cabbf50943" path="/var/lib/kubelet/pods/185b3c31-7ccc-4f8d-bcb1-20cabbf50943/volumes" Feb 17 13:47:56 crc kubenswrapper[4804]: I0217 13:47:56.640329 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec519a7-9081-4341-ad6c-c81dda70bd3a" path="/var/lib/kubelet/pods/8ec519a7-9081-4341-ad6c-c81dda70bd3a/volumes" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.014790 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerStarted","Data":"75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce"} Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.018285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerStarted","Data":"62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3"} Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.041022 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-db-create-6h6dp" podStartSLOduration=4.040997403 podStartE2EDuration="4.040997403s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.032139324 +0000 UTC m=+1351.143558671" watchObservedRunningTime="2026-02-17 13:47:57.040997403 +0000 UTC m=+1351.152416750" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.051517 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nn6tq" podStartSLOduration=4.051493893 podStartE2EDuration="4.051493893s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.046651991 +0000 UTC m=+1351.158071328" watchObservedRunningTime="2026-02-17 13:47:57.051493893 +0000 UTC m=+1351.162913230" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.064126 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-570c-account-create-update-48hmw" podStartSLOduration=4.06410912 podStartE2EDuration="4.06410912s" podCreationTimestamp="2026-02-17 13:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:57.061531309 +0000 UTC m=+1351.172950646" watchObservedRunningTime="2026-02-17 13:47:57.06410912 +0000 UTC m=+1351.175528457" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.127616 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.265792 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 13:47:57 crc kubenswrapper[4804]: W0217 
13:47:57.270672 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f268a5_3c72_4655_bb36_823c34e5312d.slice/crio-d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab WatchSource:0}: Error finding container d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab: Status 404 returned error can't find the container with id d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.700753 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c576cfd85-655nj" Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.775243 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.775884 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547f989fd6-rqkvc" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" containerID="cri-o://8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" gracePeriod=30 Feb 17 13:47:57 crc kubenswrapper[4804]: I0217 13:47:57.776192 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547f989fd6-rqkvc" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" containerID="cri-o://60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" gracePeriod=30 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.049463 4804 generic.go:334] "Generic (PLEG): container finished" podID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerID="62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.049549 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" 
event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerDied","Data":"62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.059429 4804 generic.go:334] "Generic (PLEG): container finished" podID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerID="cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.059530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerDied","Data":"cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.071144 4804 generic.go:334] "Generic (PLEG): container finished" podID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerID="feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.071314 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerDied","Data":"feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.078359 4804 generic.go:334] "Generic (PLEG): container finished" podID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.078427 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.085510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"93ebf4cecf403069ba9d9266b5376f8b2ebb0ce31269d457e86f49f2965016a6"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.085565 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"59c1c254275218fcd0dec8d899a862c6d1bd00eea43efe6566e036ac8b535b56"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.089440 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"d3346765f7a3bbcc535416a496460719664ec3499f65fff05932d896963ca0ab"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.098001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.111103 4804 generic.go:334] "Generic (PLEG): container finished" podID="3d23eb85-73ab-4049-b6be-486640c922e0" containerID="04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.111177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerDied","Data":"04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.115063 4804 generic.go:334] "Generic (PLEG): container finished" podID="1517f905-d980-43be-8583-f1a40170752e" containerID="76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0" exitCode=0 Feb 17 13:47:58 crc 
kubenswrapper[4804]: I0217 13:47:58.115146 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerDied","Data":"76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0"} Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.136428 4804 generic.go:334] "Generic (PLEG): container finished" podID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerID="75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce" exitCode=0 Feb 17 13:47:58 crc kubenswrapper[4804]: I0217 13:47:58.136663 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerDied","Data":"75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.146575 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.149615 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc2e7136-825b-4608-a106-944f359c7369","Type":"ContainerStarted","Data":"19f884a664455e9314665febd087d4c25c36dd754e8ab45ab4f158f1edbff08d"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.151909 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"c0cebc094d70bff51090146dd4586fa1f95f69a8f0dce5560eb8a6ad904ae9aa"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.151940 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"52f268a5-3c72-4655-bb36-823c34e5312d","Type":"ContainerStarted","Data":"905daab68904cbf4ce0b38c94e620f7c4eb4d2d220a0a793f2285c0b9e8354ea"} Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.187332 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.187307259 podStartE2EDuration="5.187307259s" podCreationTimestamp="2026-02-17 13:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:59.17303997 +0000 UTC m=+1353.284459307" watchObservedRunningTime="2026-02-17 13:47:59.187307259 +0000 UTC m=+1353.298726596" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.863545 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.894705 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.8946875949999997 podStartE2EDuration="3.894687595s" podCreationTimestamp="2026-02-17 13:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:47:59.203975383 +0000 UTC m=+1353.315394720" watchObservedRunningTime="2026-02-17 13:47:59.894687595 +0000 UTC m=+1354.006106922" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.920311 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.954341 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.973969 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.982763 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:47:59 crc kubenswrapper[4804]: I0217 13:47:59.993738 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.000859 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") pod \"1517f905-d980-43be-8583-f1a40170752e\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.001113 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") pod \"1517f905-d980-43be-8583-f1a40170752e\" (UID: \"1517f905-d980-43be-8583-f1a40170752e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.001900 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1517f905-d980-43be-8583-f1a40170752e" (UID: "1517f905-d980-43be-8583-f1a40170752e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.015757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b" (OuterVolumeSpecName: "kube-api-access-rjp5b") pod "1517f905-d980-43be-8583-f1a40170752e" (UID: "1517f905-d980-43be-8583-f1a40170752e"). InnerVolumeSpecName "kube-api-access-rjp5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") pod \"92d9081e-1e94-4244-b66a-34b05bc98f2d\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102524 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") pod \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") pod \"ccb316de-cd6e-4f79-9387-81f7a8add771\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") pod \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\" (UID: \"f3c65a30-a890-4d85-80ca-93f9420d5aa4\") " Feb 17 13:48:00 crc 
kubenswrapper[4804]: I0217 13:48:00.102633 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") pod \"3d23eb85-73ab-4049-b6be-486640c922e0\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102664 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") pod \"92d9081e-1e94-4244-b66a-34b05bc98f2d\" (UID: \"92d9081e-1e94-4244-b66a-34b05bc98f2d\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") pod \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102790 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") pod \"3d23eb85-73ab-4049-b6be-486640c922e0\" (UID: \"3d23eb85-73ab-4049-b6be-486640c922e0\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") pod \"ccb316de-cd6e-4f79-9387-81f7a8add771\" (UID: \"ccb316de-cd6e-4f79-9387-81f7a8add771\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.102911 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzbw2\" (UniqueName: 
\"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") pod \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\" (UID: \"5fa81aac-8f7a-4947-9fbe-c38851b3652e\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103361 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1517f905-d980-43be-8583-f1a40170752e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103375 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjp5b\" (UniqueName: \"kubernetes.io/projected/1517f905-d980-43be-8583-f1a40170752e-kube-api-access-rjp5b\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103767 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d23eb85-73ab-4049-b6be-486640c922e0" (UID: "3d23eb85-73ab-4049-b6be-486640c922e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103797 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92d9081e-1e94-4244-b66a-34b05bc98f2d" (UID: "92d9081e-1e94-4244-b66a-34b05bc98f2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.103784 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fa81aac-8f7a-4947-9fbe-c38851b3652e" (UID: "5fa81aac-8f7a-4947-9fbe-c38851b3652e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.104146 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ccb316de-cd6e-4f79-9387-81f7a8add771" (UID: "ccb316de-cd6e-4f79-9387-81f7a8add771"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.104997 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3c65a30-a890-4d85-80ca-93f9420d5aa4" (UID: "f3c65a30-a890-4d85-80ca-93f9420d5aa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.109798 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2" (OuterVolumeSpecName: "kube-api-access-zzbw2") pod "5fa81aac-8f7a-4947-9fbe-c38851b3652e" (UID: "5fa81aac-8f7a-4947-9fbe-c38851b3652e"). InnerVolumeSpecName "kube-api-access-zzbw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.114467 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7" (OuterVolumeSpecName: "kube-api-access-mjrg7") pod "92d9081e-1e94-4244-b66a-34b05bc98f2d" (UID: "92d9081e-1e94-4244-b66a-34b05bc98f2d"). InnerVolumeSpecName "kube-api-access-mjrg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.114580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872" (OuterVolumeSpecName: "kube-api-access-qt872") pod "3d23eb85-73ab-4049-b6be-486640c922e0" (UID: "3d23eb85-73ab-4049-b6be-486640c922e0"). InnerVolumeSpecName "kube-api-access-qt872". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.117007 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm" (OuterVolumeSpecName: "kube-api-access-p2rgm") pod "ccb316de-cd6e-4f79-9387-81f7a8add771" (UID: "ccb316de-cd6e-4f79-9387-81f7a8add771"). InnerVolumeSpecName "kube-api-access-p2rgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.117829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5" (OuterVolumeSpecName: "kube-api-access-6mqt5") pod "f3c65a30-a890-4d85-80ca-93f9420d5aa4" (UID: "f3c65a30-a890-4d85-80ca-93f9420d5aa4"). InnerVolumeSpecName "kube-api-access-6mqt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.167595 4804 generic.go:334] "Generic (PLEG): container finished" podID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerID="d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3" exitCode=137 Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.167668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170555 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-570c-account-create-update-48hmw" event={"ID":"ccb316de-cd6e-4f79-9387-81f7a8add771","Type":"ContainerDied","Data":"199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170587 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199b20de65a928c87690cfbe6b6a25c5d3467e26b72add5ce1fab6172ff92b03" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.170644 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-570c-account-create-update-48hmw" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174515 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-582lj" event={"ID":"1517f905-d980-43be-8583-f1a40170752e","Type":"ContainerDied","Data":"a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174543 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ffec33f37d9af4010f0839f331e730d0f7b8e15e52fe90f544650f10da490a" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.174598 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-582lj" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180813 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn6tq" event={"ID":"5fa81aac-8f7a-4947-9fbe-c38851b3652e","Type":"ContainerDied","Data":"b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180849 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39e64891804c87744085d0038f167adab37c91526911b1d675fc165390c9ae6" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.180922 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn6tq" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185891 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6h6dp" event={"ID":"f3c65a30-a890-4d85-80ca-93f9420d5aa4","Type":"ContainerDied","Data":"7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185898 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6h6dp" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.185925 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc6124a6d90d9e051ea54c2356413344027c38ef2bbf16638c22f2aa3317a37" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6388-account-create-update-skdjv" event={"ID":"92d9081e-1e94-4244-b66a-34b05bc98f2d","Type":"ContainerDied","Data":"0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191522 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0430e4597f7a0ce9032791766bb4dd9708a3c867ce5a3db25a833e2a1bde6abd" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.191687 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6388-account-create-update-skdjv" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204925 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjrg7\" (UniqueName: \"kubernetes.io/projected/92d9081e-1e94-4244-b66a-34b05bc98f2d-kube-api-access-mjrg7\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204956 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fa81aac-8f7a-4947-9fbe-c38851b3652e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204966 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt872\" (UniqueName: \"kubernetes.io/projected/3d23eb85-73ab-4049-b6be-486640c922e0-kube-api-access-qt872\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204975 4804 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb316de-cd6e-4f79-9387-81f7a8add771-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.204985 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzbw2\" (UniqueName: \"kubernetes.io/projected/5fa81aac-8f7a-4947-9fbe-c38851b3652e-kube-api-access-zzbw2\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205020 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d9081e-1e94-4244-b66a-34b05bc98f2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205029 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3c65a30-a890-4d85-80ca-93f9420d5aa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205038 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rgm\" (UniqueName: \"kubernetes.io/projected/ccb316de-cd6e-4f79-9387-81f7a8add771-kube-api-access-p2rgm\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205046 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqt5\" (UniqueName: \"kubernetes.io/projected/f3c65a30-a890-4d85-80ca-93f9420d5aa4-kube-api-access-6mqt5\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.205055 4804 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d23eb85-73ab-4049-b6be-486640c922e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.207626 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213920 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" event={"ID":"3d23eb85-73ab-4049-b6be-486640c922e0","Type":"ContainerDied","Data":"c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55"} Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213989 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c524172161ffac83a0b6e7a5805c119f237374e27cb6f6b470e9d29ed3840c55" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.213938 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2eb5-account-create-update-xv5m7" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.288940 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.406981 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407144 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407227 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407272 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407318 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407422 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wfg7\" (UniqueName: 
\"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407511 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") pod \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\" (UID: \"85415d6a-8a5f-4b65-b182-2bfe221e8eee\") " Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.407588 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs" (OuterVolumeSpecName: "logs") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.408117 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85415d6a-8a5f-4b65-b182-2bfe221e8eee-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.414664 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.416284 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7" (OuterVolumeSpecName: "kube-api-access-6wfg7") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "kube-api-access-6wfg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.438937 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts" (OuterVolumeSpecName: "scripts") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.452316 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.469599 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data" (OuterVolumeSpecName: "config-data") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.479571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "85415d6a-8a5f-4b65-b182-2bfe221e8eee" (UID: "85415d6a-8a5f-4b65-b182-2bfe221e8eee"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510318 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510357 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510371 4804 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85415d6a-8a5f-4b65-b182-2bfe221e8eee-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510383 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wfg7\" (UniqueName: \"kubernetes.io/projected/85415d6a-8a5f-4b65-b182-2bfe221e8eee-kube-api-access-6wfg7\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510401 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.510412 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/85415d6a-8a5f-4b65-b182-2bfe221e8eee-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.913119 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:48:00 crc kubenswrapper[4804]: I0217 13:48:00.942005 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-59cfdfc65f-48l6n" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.152942 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.228078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerStarted","Data":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"} Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.229724 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" containerID="cri-o://3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" gracePeriod=30 Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.229877 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230055 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" containerID="cri-o://0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" gracePeriod=30 Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230221 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" containerID="cri-o://b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" gracePeriod=30 Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.230272 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" containerID="cri-o://7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" gracePeriod=30 Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.240177 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58989b55cb-zjfvf" event={"ID":"85415d6a-8a5f-4b65-b182-2bfe221e8eee","Type":"ContainerDied","Data":"70f335cc0aa83fd894a693104e67ff9d41e07158faf0aa4fa4d67a39b59c2aa3"} Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.240927 4804 scope.go:117] "RemoveContainer" containerID="c565845aca9ef2b15231e4cf93626b2f7262c579528562e984d56c20dda93983" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.241091 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58989b55cb-zjfvf" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.248759 4804 generic.go:334] "Generic (PLEG): container finished" podID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" exitCode=0 Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249556 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-547f989fd6-rqkvc" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249878 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"} Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.249904 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547f989fd6-rqkvc" event={"ID":"a2f2352e-7e9b-439f-be3c-b48b70681658","Type":"ContainerDied","Data":"f722d26b35de998b04775d73a392cd120313a641cde842ae74275d679995720d"} Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.265799 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5663279660000002 podStartE2EDuration="9.265777123s" podCreationTimestamp="2026-02-17 13:47:52 +0000 UTC" firstStartedPulling="2026-02-17 13:47:53.983558937 +0000 UTC m=+1348.094978274" lastFinishedPulling="2026-02-17 13:48:00.683008094 +0000 UTC m=+1354.794427431" observedRunningTime="2026-02-17 13:48:01.255086387 +0000 UTC m=+1355.366505734" watchObservedRunningTime="2026-02-17 13:48:01.265777123 +0000 UTC m=+1355.377196470" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.278649 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.295573 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58989b55cb-zjfvf"] Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.332862 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " Feb 17 13:48:01 crc 
kubenswrapper[4804]: I0217 13:48:01.333043 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333127 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333182 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.333226 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") pod \"a2f2352e-7e9b-439f-be3c-b48b70681658\" (UID: \"a2f2352e-7e9b-439f-be3c-b48b70681658\") " Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.337996 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc" (OuterVolumeSpecName: "kube-api-access-x29pc") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "kube-api-access-x29pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.341805 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.389582 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.398019 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config" (OuterVolumeSpecName: "config") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.420289 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a2f2352e-7e9b-439f-be3c-b48b70681658" (UID: "a2f2352e-7e9b-439f-be3c-b48b70681658"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.430043 4804 scope.go:117] "RemoveContainer" containerID="d85afc401ad87104d844d4c1c5c56bfe2224eb996820680ca9a6f48ab88469e3" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435149 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435186 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x29pc\" (UniqueName: \"kubernetes.io/projected/a2f2352e-7e9b-439f-be3c-b48b70681658-kube-api-access-x29pc\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435220 4804 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435234 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.435245 4804 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a2f2352e-7e9b-439f-be3c-b48b70681658-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.541128 4804 scope.go:117] "RemoveContainer" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.616143 4804 scope.go:117] "RemoveContainer" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.623953 4804 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.636638 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-547f989fd6-rqkvc"] Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637271 4804 scope.go:117] "RemoveContainer" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" Feb 17 13:48:01 crc kubenswrapper[4804]: E0217 13:48:01.637773 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": container with ID starting with 60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad not found: ID does not exist" containerID="60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637826 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad"} err="failed to get container status \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": rpc error: code = NotFound desc = could not find container \"60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad\": container with ID starting with 60e764f5ff10863b3c00346932550ad88cccb6cdd9323fc257402af064f3b3ad not found: ID does not exist" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.637856 4804 scope.go:117] "RemoveContainer" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" Feb 17 13:48:01 crc kubenswrapper[4804]: E0217 13:48:01.638138 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": container with ID starting with 
8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef not found: ID does not exist" containerID="8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef" Feb 17 13:48:01 crc kubenswrapper[4804]: I0217 13:48:01.638247 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef"} err="failed to get container status \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": rpc error: code = NotFound desc = could not find container \"8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef\": container with ID starting with 8b150042424233c8bc124629fd2b08c59b6d968c4392bc674f15f7a3a798c1ef not found: ID does not exist" Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259904 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" exitCode=0 Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259939 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" exitCode=2 Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259947 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" exitCode=0 Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.259992 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"} Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.260049 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"} Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.260064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"} Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.586272 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" path="/var/lib/kubelet/pods/85415d6a-8a5f-4b65-b182-2bfe221e8eee/volumes" Feb 17 13:48:02 crc kubenswrapper[4804]: I0217 13:48:02.587131 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" path="/var/lib/kubelet/pods/a2f2352e-7e9b-439f-be3c-b48b70681658/volumes" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.134258 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.274662 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.274705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.275876 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.275942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276084 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276151 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276185 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") pod \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\" (UID: \"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc\") " Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.276499 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.277145 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.277571 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.284286 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts" (OuterVolumeSpecName: "scripts") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286145 4804 generic.go:334] "Generic (PLEG): container finished" podID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" exitCode=0 Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286192 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"} Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286243 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab10dbde-da5a-4d9d-ae66-5fcd9af104dc","Type":"ContainerDied","Data":"b3be3c859965ba94fbed2ed95fd8afc727f019eeb34bd5903d88d5dbf4d77e51"} Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286264 4804 scope.go:117] "RemoveContainer" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.286487 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.296798 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm" (OuterVolumeSpecName: "kube-api-access-hr5gm") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "kube-api-access-hr5gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.300734 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.343279 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379463 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379731 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5gm\" (UniqueName: \"kubernetes.io/projected/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-kube-api-access-hr5gm\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379804 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379874 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.379973 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.391028 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data" (OuterVolumeSpecName: "config-data") pod "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" (UID: "ab10dbde-da5a-4d9d-ae66-5fcd9af104dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.418589 4804 scope.go:117] "RemoveContainer" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.454600 4804 scope.go:117] "RemoveContainer" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.482163 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.496565 4804 scope.go:117] "RemoveContainer" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.516614 4804 scope.go:117] "RemoveContainer" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517240 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": container with ID starting with 
0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6 not found: ID does not exist" containerID="0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517276 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6"} err="failed to get container status \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": rpc error: code = NotFound desc = could not find container \"0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6\": container with ID starting with 0fdb1bb4d257b541f4629651d861467c36e095d58a730ecfa35b94d6d1af66c6 not found: ID does not exist" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517330 4804 scope.go:117] "RemoveContainer" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517626 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": container with ID starting with b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270 not found: ID does not exist" containerID="b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517673 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270"} err="failed to get container status \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": rpc error: code = NotFound desc = could not find container \"b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270\": container with ID starting with b04453f65e9e684dd24047e6b46f78eee1f2d8e9656b8fe9c4e9c52c3026b270 not found: ID does not 
exist" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.517692 4804 scope.go:117] "RemoveContainer" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.517989 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": container with ID starting with 7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293 not found: ID does not exist" containerID="7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518066 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293"} err="failed to get container status \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": rpc error: code = NotFound desc = could not find container \"7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293\": container with ID starting with 7760a459c274d77084f2e91289ebcf9b60005e0f3aedd7ef868b73f7e33c8293 not found: ID does not exist" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518086 4804 scope.go:117] "RemoveContainer" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.518354 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": container with ID starting with 3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256 not found: ID does not exist" containerID="3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.518383 4804 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256"} err="failed to get container status \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": rpc error: code = NotFound desc = could not find container \"3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256\": container with ID starting with 3c17e8a3044ba81c152d2c764a260f2c9dbcccc9c594584aa4f49d45c1b35256 not found: ID does not exist" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.625027 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.632529 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662292 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662763 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662780 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662799 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662813 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662826 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update" Feb 
17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662832 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662840 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662846 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662857 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662863 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662878 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662883 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662890 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662895 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662910 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" 
containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662915 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662924 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662930 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662939 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662945 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662956 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662962 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662972 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662977 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.662988 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" 
containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.662994 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: E0217 13:48:03.663008 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663014 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663194 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-notification-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663224 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1517f905-d980-43be-8583-f1a40170752e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663236 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663251 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663265 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" containerName="mariadb-account-create-update" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663275 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663287 4804 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663298 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="sg-core" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663312 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663327 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="proxy-httpd" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663340 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" containerName="ceilometer-central-agent" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663353 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f2352e-7e9b-439f-be3c-b48b70681658" containerName="neutron-api" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663366 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="85415d6a-8a5f-4b65-b182-2bfe221e8eee" containerName="horizon-log" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.663372 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" containerName="mariadb-database-create" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.669673 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.672906 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.673694 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.673891 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787259 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787339 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787514 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787654 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 
13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787845 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.787933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.788139 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889509 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889743 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889792 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889816 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889841 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889888 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.889909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.890980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc 
kubenswrapper[4804]: I0217 13:48:03.891029 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.894361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.896184 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.902034 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.904066 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.908666 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"ceilometer-0\" (UID: 
\"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " pod="openstack/ceilometer-0" Feb 17 13:48:03 crc kubenswrapper[4804]: I0217 13:48:03.991022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.353486 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.355169 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358257 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358297 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxrxn" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.358599 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.366472 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502252 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502312 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.502376 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.512487 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:04 crc kubenswrapper[4804]: W0217 13:48:04.526090 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f98588b_4340_42cc_af47_1f1d5c0c6d0f.slice/crio-f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a WatchSource:0}: Error finding container f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a: Status 404 returned error can't find the container with id f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.582964 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab10dbde-da5a-4d9d-ae66-5fcd9af104dc" path="/var/lib/kubelet/pods/ab10dbde-da5a-4d9d-ae66-5fcd9af104dc/volumes" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603840 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603896 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603922 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.603956 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610454 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.610557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.619513 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"nova-cell0-conductor-db-sync-ndx9s\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:04 crc kubenswrapper[4804]: I0217 13:48:04.678879 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.136940 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.307384 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.307433 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.308813 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerStarted","Data":"0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c"} Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.393402 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.393455 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.431959 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:48:05 crc kubenswrapper[4804]: I0217 13:48:05.444475 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.176471 4804 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310"} Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334544 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.334591 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.416150 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.417417 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.466902 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:06 crc kubenswrapper[4804]: I0217 13:48:06.501858 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.347762 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115"} Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.348138 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:07 crc kubenswrapper[4804]: I0217 13:48:07.348158 4804 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.368817 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" containerID="cri-o://e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369330 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerStarted","Data":"8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2"} Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369372 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369390 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" containerID="cri-o://4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369447 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" containerID="cri-o://8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" gracePeriod=30 Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.369415 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" containerID="cri-o://706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" gracePeriod=30 Feb 17 13:48:08 crc 
kubenswrapper[4804]: I0217 13:48:08.397185 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141068828 podStartE2EDuration="5.397167779s" podCreationTimestamp="2026-02-17 13:48:03 +0000 UTC" firstStartedPulling="2026-02-17 13:48:04.528859463 +0000 UTC m=+1358.640278800" lastFinishedPulling="2026-02-17 13:48:07.784958414 +0000 UTC m=+1361.896377751" observedRunningTime="2026-02-17 13:48:08.393774452 +0000 UTC m=+1362.505193789" watchObservedRunningTime="2026-02-17 13:48:08.397167779 +0000 UTC m=+1362.508587116" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.552032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.552782 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:48:08 crc kubenswrapper[4804]: I0217 13:48:08.566566 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405437 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" exitCode=0 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405482 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" exitCode=2 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.405492 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" exitCode=0 Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406404 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406490 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.406507 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310"} Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.951886 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:09 crc kubenswrapper[4804]: I0217 13:48:09.951988 4804 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 13:48:10 crc kubenswrapper[4804]: I0217 13:48:10.378747 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 13:48:16 crc kubenswrapper[4804]: I0217 13:48:16.503983 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerStarted","Data":"fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729"} Feb 17 13:48:16 crc kubenswrapper[4804]: I0217 13:48:16.528917 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" podStartSLOduration=2.14612883 podStartE2EDuration="12.528900646s" podCreationTimestamp="2026-02-17 13:48:04 +0000 UTC" firstStartedPulling="2026-02-17 13:48:05.151126483 +0000 UTC m=+1359.262545830" 
lastFinishedPulling="2026-02-17 13:48:15.533898309 +0000 UTC m=+1369.645317646" observedRunningTime="2026-02-17 13:48:16.52392651 +0000 UTC m=+1370.635345857" watchObservedRunningTime="2026-02-17 13:48:16.528900646 +0000 UTC m=+1370.640319983" Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.524380 4804 generic.go:334] "Generic (PLEG): container finished" podID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerID="e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" exitCode=0 Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.524594 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb"} Feb 17 13:48:18 crc kubenswrapper[4804]: I0217 13:48:18.888085 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001274 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001356 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: 
\"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001484 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001562 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001599 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.001634 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") pod \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\" (UID: \"2f98588b-4340-42cc-af47-1f1d5c0c6d0f\") " Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.002406 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.002624 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.007537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts" (OuterVolumeSpecName: "scripts") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.020543 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj" (OuterVolumeSpecName: "kube-api-access-8kzxj") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "kube-api-access-8kzxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.031429 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.073485 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103728 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103760 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kzxj\" (UniqueName: \"kubernetes.io/projected/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-kube-api-access-8kzxj\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103771 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103780 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103788 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.103798 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.106607 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data" (OuterVolumeSpecName: "config-data") pod "2f98588b-4340-42cc-af47-1f1d5c0c6d0f" (UID: "2f98588b-4340-42cc-af47-1f1d5c0c6d0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.205645 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f98588b-4340-42cc-af47-1f1d5c0c6d0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537373 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f98588b-4340-42cc-af47-1f1d5c0c6d0f","Type":"ContainerDied","Data":"f9d41a7bd6eeacec0471c386bde6cc241d19539177f7e3af7b07ac298754582a"} Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537451 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.537805 4804 scope.go:117] "RemoveContainer" containerID="8477f92c817b81c812d2b022582de811c55e1932bdd1e29d5f6293940edfafa2" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.562844 4804 scope.go:117] "RemoveContainer" containerID="4becf17ec69ec73b9e5515e279e099dcb395885b98691d86a0bf3f748a338115" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.576077 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.584002 4804 scope.go:117] "RemoveContainer" containerID="706d2ec04359393f0bf73ae7699e4341238c751922d5d96808f68c292e5e4310" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.584263 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625233 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625667 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625684 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625710 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625731 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625753 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625759 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: E0217 13:48:19.625778 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625785 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625960 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="proxy-httpd" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625980 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-notification-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.625998 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="ceilometer-central-agent" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.626009 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" containerName="sg-core" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.627956 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.628103 4804 scope.go:117] "RemoveContainer" containerID="e05005cc28b8a5781793808dbb22035a5311dc438a83d9d65723b579e42deefb" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.631116 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.631362 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.634622 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714032 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714096 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714166 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714617 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714814 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.714918 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.715045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816683 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816728 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816821 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816847 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816879 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.816930 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.817658 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.817856 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.822336 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.822795 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.833857 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.834175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.843234 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"ceilometer-0\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " pod="openstack/ceilometer-0" Feb 17 13:48:19 crc kubenswrapper[4804]: I0217 13:48:19.956344 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:20 crc kubenswrapper[4804]: W0217 13:48:20.484090 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4de89973_4899_493b_aacb_b8c3b5c96b5d.slice/crio-34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b WatchSource:0}: Error finding container 34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b: Status 404 returned error can't find the container with id 34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.484539 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.546550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b"} Feb 17 13:48:20 crc kubenswrapper[4804]: I0217 13:48:20.584154 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f98588b-4340-42cc-af47-1f1d5c0c6d0f" path="/var/lib/kubelet/pods/2f98588b-4340-42cc-af47-1f1d5c0c6d0f/volumes" Feb 17 13:48:21 crc kubenswrapper[4804]: I0217 13:48:21.557922 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"} Feb 17 13:48:22 crc 
kubenswrapper[4804]: I0217 13:48:22.571853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"} Feb 17 13:48:23 crc kubenswrapper[4804]: I0217 13:48:23.580960 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"} Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.145574 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.592781 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerStarted","Data":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"} Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.593110 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:24 crc kubenswrapper[4804]: I0217 13:48:24.620340 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.248965435 podStartE2EDuration="5.620315749s" podCreationTimestamp="2026-02-17 13:48:19 +0000 UTC" firstStartedPulling="2026-02-17 13:48:20.486665673 +0000 UTC m=+1374.598085010" lastFinishedPulling="2026-02-17 13:48:23.858015977 +0000 UTC m=+1377.969435324" observedRunningTime="2026-02-17 13:48:24.612451272 +0000 UTC m=+1378.723870619" watchObservedRunningTime="2026-02-17 13:48:24.620315749 +0000 UTC m=+1378.731735086" Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.606447 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent" containerID="cri-o://86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607455 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core" containerID="cri-o://76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607598 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd" containerID="cri-o://138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" gracePeriod=30 Feb 17 13:48:25 crc kubenswrapper[4804]: I0217 13:48:25.607672 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent" containerID="cri-o://d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" gracePeriod=30 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.617890 4804 generic.go:334] "Generic (PLEG): container finished" podID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerID="fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729" exitCode=0 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.618098 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerDied","Data":"fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729"} Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.623989 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" 
containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" exitCode=0 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624031 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" exitCode=2 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624047 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" exitCode=0 Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624023 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"} Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624130 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"} Feb 17 13:48:26 crc kubenswrapper[4804]: I0217 13:48:26.624167 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"} Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.022278 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067436 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067583 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067675 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.067784 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") pod \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\" (UID: \"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53\") " Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.084467 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm" (OuterVolumeSpecName: "kube-api-access-qltdm") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "kube-api-access-qltdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.095146 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts" (OuterVolumeSpecName: "scripts") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.099044 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data" (OuterVolumeSpecName: "config-data") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.105776 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" (UID: "20c077b5-d559-4c19-b8ee-f1b7ebf3fc53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169670 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169724 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qltdm\" (UniqueName: \"kubernetes.io/projected/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-kube-api-access-qltdm\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169748 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.169764 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649281 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649473 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndx9s" event={"ID":"20c077b5-d559-4c19-b8ee-f1b7ebf3fc53","Type":"ContainerDied","Data":"0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c"} Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.649655 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b89b75bf22a2ce250e51320c1e06b88aa347683ca582f8db810c8648296646c" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722188 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 13:48:28 crc kubenswrapper[4804]: E0217 13:48:28.722698 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722724 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.722960 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" containerName="nova-cell0-conductor-db-sync" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.723738 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.727940 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxrxn" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.728994 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.734979 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781446 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz2x\" (UniqueName: \"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.781796 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.883982 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz2x\" (UniqueName: 
\"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.884089 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.884171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.888061 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.889462 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78e86d-494e-417b-8569-b564cdbd069a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:28 crc kubenswrapper[4804]: I0217 13:48:28.903975 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz2x\" (UniqueName: \"kubernetes.io/projected/fc78e86d-494e-417b-8569-b564cdbd069a-kube-api-access-mtz2x\") pod \"nova-cell0-conductor-0\" (UID: 
\"fc78e86d-494e-417b-8569-b564cdbd069a\") " pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.041582 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.499778 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 13:48:29 crc kubenswrapper[4804]: W0217 13:48:29.510027 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc78e86d_494e_417b_8569_b564cdbd069a.slice/crio-2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240 WatchSource:0}: Error finding container 2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240: Status 404 returned error can't find the container with id 2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240 Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.546713 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596673 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596811 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596847 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596892 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.596991 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.597019 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.598318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.598354 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.600510 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts" (OuterVolumeSpecName: "scripts") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.618528 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv" (OuterVolumeSpecName: "kube-api-access-854dv") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "kube-api-access-854dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.668429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc78e86d-494e-417b-8569-b564cdbd069a","Type":"ContainerStarted","Data":"2de277511dd91bf90e302e0eb111d11f5bcbb32d3c5a517d71ff8e3ffc46d240"} Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672166 4804 generic.go:334] "Generic (PLEG): container finished" podID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" exitCode=0 Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672218 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"} Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672244 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4de89973-4899-493b-aacb-b8c3b5c96b5d","Type":"ContainerDied","Data":"34cf4653df9d04a4b888d17573f24b70b1233173f57dd3617dc2eff2bc8c163b"} Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672284 4804 scope.go:117] "RemoveContainer" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.672332 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.694852 4804 scope.go:117] "RemoveContainer" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698512 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698639 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") pod \"4de89973-4899-493b-aacb-b8c3b5c96b5d\" (UID: \"4de89973-4899-493b-aacb-b8c3b5c96b5d\") " Feb 17 13:48:29 crc kubenswrapper[4804]: W0217 13:48:29.698764 4804 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4de89973-4899-493b-aacb-b8c3b5c96b5d/volumes/kubernetes.io~secret/sg-core-conf-yaml Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.698783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699098 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699119 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-854dv\" (UniqueName: \"kubernetes.io/projected/4de89973-4899-493b-aacb-b8c3b5c96b5d-kube-api-access-854dv\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699131 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699141 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.699148 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4de89973-4899-493b-aacb-b8c3b5c96b5d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.701075 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.715501 4804 scope.go:117] "RemoveContainer" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.736354 4804 scope.go:117] "RemoveContainer" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.737148 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data" (OuterVolumeSpecName: "config-data") pod "4de89973-4899-493b-aacb-b8c3b5c96b5d" (UID: "4de89973-4899-493b-aacb-b8c3b5c96b5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759330 4804 scope.go:117] "RemoveContainer" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.759726 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": container with ID starting with 138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285 not found: ID does not exist" containerID="138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759756 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285"} err="failed to get container status \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": rpc error: code = NotFound desc = could not find container \"138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285\": container with ID starting with 
138a258c6b0aa4915a9d16755f53cbffb72137ccc3a84d58203caf8aee963285 not found: ID does not exist" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.759777 4804 scope.go:117] "RemoveContainer" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760021 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": container with ID starting with 76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07 not found: ID does not exist" containerID="76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760044 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07"} err="failed to get container status \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": rpc error: code = NotFound desc = could not find container \"76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07\": container with ID starting with 76d07ecd0c3dee98bdf255c8b3e68aaf22b47767cc9601cc9638733e8c9d1e07 not found: ID does not exist" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760057 4804 scope.go:117] "RemoveContainer" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760476 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": container with ID starting with d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab not found: ID does not exist" containerID="d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab" Feb 17 13:48:29 crc 
kubenswrapper[4804]: I0217 13:48:29.760510 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab"} err="failed to get container status \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": rpc error: code = NotFound desc = could not find container \"d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab\": container with ID starting with d027896f62a09ecf90b8c05201731b7d1aa3efa1ea9463626030d6c580314cab not found: ID does not exist" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760523 4804 scope.go:117] "RemoveContainer" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" Feb 17 13:48:29 crc kubenswrapper[4804]: E0217 13:48:29.760891 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": container with ID starting with 86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6 not found: ID does not exist" containerID="86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.760947 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6"} err="failed to get container status \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": rpc error: code = NotFound desc = could not find container \"86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6\": container with ID starting with 86b0a242ba2aec826d8895f2a0fdd31305fd55e03bb556f3abf95a057a33d8c6 not found: ID does not exist" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.801036 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:29 crc kubenswrapper[4804]: I0217 13:48:29.801064 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de89973-4899-493b-aacb-b8c3b5c96b5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.025181 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.033349 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048367 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048726 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048739 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048758 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048765 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core" Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048773 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048780 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" 
containerName="ceilometer-notification-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: E0217 13:48:30.048787 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048793 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048978 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="proxy-httpd" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.048993 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="sg-core" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.049005 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-notification-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.049015 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" containerName="ceilometer-central-agent" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.050924 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.054492 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.054798 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.066118 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106667 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106729 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106753 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106795 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " 
pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106816 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106861 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.106898 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208447 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208546 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208584 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208618 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208679 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.208740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.209062 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 
crc kubenswrapper[4804]: I0217 13:48:30.209463 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.213106 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.213624 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.214240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.217976 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.231279 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"ceilometer-0\" (UID: 
\"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.367514 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.583612 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de89973-4899-493b-aacb-b8c3b5c96b5d" path="/var/lib/kubelet/pods/4de89973-4899-493b-aacb-b8c3b5c96b5d/volumes" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.686347 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc78e86d-494e-417b-8569-b564cdbd069a","Type":"ContainerStarted","Data":"5a9971c09b621119088ea175c0c01a351ea9bd051bc8add3c4b17f4c234c4088"} Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.686433 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.700042 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.700027236 podStartE2EDuration="2.700027236s" podCreationTimestamp="2026-02-17 13:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:30.699373235 +0000 UTC m=+1384.810792572" watchObservedRunningTime="2026-02-17 13:48:30.700027236 +0000 UTC m=+1384.811446573" Feb 17 13:48:30 crc kubenswrapper[4804]: I0217 13:48:30.849303 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:48:30 crc kubenswrapper[4804]: W0217 13:48:30.851472 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e6284b7_c2bf_491d_a8b8_66390efc3657.slice/crio-2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8 
WatchSource:0}: Error finding container 2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8: Status 404 returned error can't find the container with id 2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8 Feb 17 13:48:31 crc kubenswrapper[4804]: I0217 13:48:31.705305 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} Feb 17 13:48:31 crc kubenswrapper[4804]: I0217 13:48:31.706457 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8"} Feb 17 13:48:32 crc kubenswrapper[4804]: I0217 13:48:32.713679 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} Feb 17 13:48:33 crc kubenswrapper[4804]: I0217 13:48:33.725315 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.070413 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.590697 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.592388 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.598045 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.598256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.600713 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.715891 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716267 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716324 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.716413 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.730965 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.732452 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.735306 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.770663 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerStarted","Data":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.773376 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841262 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841347 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 
13:48:34.841470 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841715 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.841812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.842086 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.842162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.853006 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.856148 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.866847 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.886068 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.888456 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.905078 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.917970 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"nova-cell0-cell-mapping-pmp8r\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.922609 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.940627 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944126 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944194 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944235 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod 
\"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944262 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944323 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.944385 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.948803 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 
13:48:34.960388 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.974040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"nova-scheduler-0\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") " pod="openstack/nova-scheduler-0" Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.974684 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:34 crc kubenswrapper[4804]: I0217 13:48:34.988858 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.619131917 podStartE2EDuration="4.988841538s" podCreationTimestamp="2026-02-17 13:48:30 +0000 UTC" firstStartedPulling="2026-02-17 13:48:30.854255724 +0000 UTC m=+1384.965675101" lastFinishedPulling="2026-02-17 13:48:34.223965385 +0000 UTC m=+1388.335384722" observedRunningTime="2026-02-17 13:48:34.852558885 +0000 UTC m=+1388.963978222" watchObservedRunningTime="2026-02-17 13:48:34.988841538 +0000 UTC m=+1389.100260875" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.056750 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057559 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 
13:48:35.057685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057708 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.057921 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.063250 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.066150 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.066825 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.067819 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.080716 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.086566 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.104261 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.113660 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"nova-metadata-0\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.122530 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.134349 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.146309 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.159977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160380 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160477 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.160548 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.161001 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.163402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.168293 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.198180 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262427 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262550 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262673 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262718 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262772 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262823 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.262883 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263453 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263485 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263521 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.263586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.267964 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.274744 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.275174 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.277431 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.289778 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"nova-api-0\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") " pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366535 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366597 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366638 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366679 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366762 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366804 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366840 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod 
\"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.366994 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.368813 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.369181 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.369745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.370112 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " 
pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.370358 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.378183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.386830 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.393024 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"dnsmasq-dns-bccf8f775-nm74r\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.388288 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"nova-cell1-novncproxy-0\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.440688 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.474674 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.532525 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.720714 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.741303 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.830559 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.836536 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerStarted","Data":"b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a"} Feb 17 13:48:35 crc kubenswrapper[4804]: I0217 13:48:35.839502 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerStarted","Data":"76fb016395e0231c3d8a7ae1865fe7cb74985c931d1b06a2d2e7c2491c7f5dbe"} Feb 17 13:48:35 crc kubenswrapper[4804]: W0217 13:48:35.849542 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd12412cb_bde4_4c84_bd52_42ac9cb6232c.slice/crio-4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4 WatchSource:0}: Error finding container 4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4: Status 404 returned error can't 
find the container with id 4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.258072 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.339885 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.363349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:36 crc kubenswrapper[4804]: W0217 13:48:36.371325 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e4a011_e72b_4fea_b6cb_15425d5d5940.slice/crio-f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06 WatchSource:0}: Error finding container f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06: Status 404 returned error can't find the container with id f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06 Feb 17 13:48:36 crc kubenswrapper[4804]: W0217 13:48:36.384623 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeb819ef_7656_4054_baa2_02efb705872d.slice/crio-4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199 WatchSource:0}: Error finding container 4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199: Status 404 returned error can't find the container with id 4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.412987 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.414228 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.417120 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.417397 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.438734 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498679 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498811 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498864 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.498914 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.599999 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600057 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600098 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.600158 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.606938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.636147 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.640255 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.645693 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wq5kj\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") " pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.873877 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerStarted","Data":"29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.879494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.880987 4804 generic.go:334] "Generic (PLEG): container finished" podID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" exitCode=0 Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.881075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.881097 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerStarted","Data":"9ac5534fef55ed02d86af4d8912cb72f23f77c2e384ce39f866abb0e39f803e5"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.882402 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.886611 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerStarted","Data":"4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.890340 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4"} Feb 17 13:48:36 crc kubenswrapper[4804]: I0217 13:48:36.902087 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pmp8r" podStartSLOduration=2.902068878 podStartE2EDuration="2.902068878s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:36.895499222 +0000 UTC m=+1391.006918559" watchObservedRunningTime="2026-02-17 13:48:36.902068878 +0000 UTC m=+1391.013488215" Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.497873 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.913830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerStarted","Data":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.914543 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.927913 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerStarted","Data":"e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb"} Feb 17 13:48:37 crc kubenswrapper[4804]: I0217 13:48:37.946665 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" podStartSLOduration=3.946644933 podStartE2EDuration="3.946644933s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:37.939693004 +0000 UTC m=+1392.051112341" watchObservedRunningTime="2026-02-17 13:48:37.946644933 +0000 UTC m=+1392.058064270" Feb 17 13:48:38 crc kubenswrapper[4804]: I0217 13:48:38.827377 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:48:38 crc kubenswrapper[4804]: I0217 13:48:38.845131 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.972324 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerStarted","Data":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.972785 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" gracePeriod=30 Feb 17 13:48:40 crc kubenswrapper[4804]: I0217 13:48:40.994727 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=1.7469515329999998 podStartE2EDuration="5.994704775s" podCreationTimestamp="2026-02-17 13:48:35 +0000 UTC" firstStartedPulling="2026-02-17 13:48:36.387905206 +0000 UTC m=+1390.499324553" lastFinishedPulling="2026-02-17 13:48:40.635658458 +0000 UTC m=+1394.747077795" observedRunningTime="2026-02-17 13:48:40.991157893 +0000 UTC m=+1395.102577240" watchObservedRunningTime="2026-02-17 13:48:40.994704775 +0000 UTC m=+1395.106124112" Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.986945 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987323 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerStarted","Data":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987047 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log" containerID="cri-o://4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" gracePeriod=30 Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.987353 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata" containerID="cri-o://68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" gracePeriod=30 Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.996004 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"} Feb 17 13:48:41 crc kubenswrapper[4804]: I0217 13:48:41.996050 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerStarted","Data":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.001015 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerStarted","Data":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.005575 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerStarted","Data":"24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137"} Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.017915 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.283978363 podStartE2EDuration="8.017896947s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:35.873859688 +0000 UTC m=+1389.985279025" lastFinishedPulling="2026-02-17 13:48:40.607778272 +0000 UTC m=+1394.719197609" observedRunningTime="2026-02-17 13:48:42.012963592 +0000 UTC m=+1396.124382929" watchObservedRunningTime="2026-02-17 13:48:42.017896947 +0000 UTC m=+1396.129316284" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.038555 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" podStartSLOduration=6.038535136 podStartE2EDuration="6.038535136s" podCreationTimestamp="2026-02-17 13:48:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:42.027766317 +0000 UTC m=+1396.139185654" watchObservedRunningTime="2026-02-17 13:48:42.038535136 +0000 UTC m=+1396.149954493" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.056015 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.827953354 podStartE2EDuration="8.055997595s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:36.378587524 +0000 UTC m=+1390.490006861" lastFinishedPulling="2026-02-17 13:48:40.606631765 +0000 UTC m=+1394.718051102" observedRunningTime="2026-02-17 13:48:42.050718149 +0000 UTC m=+1396.162137496" watchObservedRunningTime="2026-02-17 13:48:42.055997595 +0000 UTC m=+1396.167416932" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.071632 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.236333265 podStartE2EDuration="8.071610295s" podCreationTimestamp="2026-02-17 13:48:34 +0000 UTC" firstStartedPulling="2026-02-17 13:48:35.771357196 +0000 UTC m=+1389.882776533" lastFinishedPulling="2026-02-17 13:48:40.606634226 +0000 UTC m=+1394.718053563" observedRunningTime="2026-02-17 13:48:42.062715176 +0000 UTC m=+1396.174134503" watchObservedRunningTime="2026-02-17 13:48:42.071610295 +0000 UTC m=+1396.183029632" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.560968 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672969 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.672999 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.673018 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") pod \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\" (UID: \"d12412cb-bde4-4c84-bd52-42ac9cb6232c\") " Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.676743 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs" (OuterVolumeSpecName: "logs") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.679416 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j" (OuterVolumeSpecName: "kube-api-access-5pk2j") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "kube-api-access-5pk2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.704529 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.708623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data" (OuterVolumeSpecName: "config-data") pod "d12412cb-bde4-4c84-bd52-42ac9cb6232c" (UID: "d12412cb-bde4-4c84-bd52-42ac9cb6232c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.774869 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12412cb-bde4-4c84-bd52-42ac9cb6232c-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775184 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pk2j\" (UniqueName: \"kubernetes.io/projected/d12412cb-bde4-4c84-bd52-42ac9cb6232c-kube-api-access-5pk2j\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775218 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:42 crc kubenswrapper[4804]: I0217 13:48:42.775232 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12412cb-bde4-4c84-bd52-42ac9cb6232c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.014545 4804 generic.go:334] "Generic (PLEG): container finished" podID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" exitCode=0 Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.014575 4804 generic.go:334] "Generic (PLEG): container finished" podID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" exitCode=143 Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.015425 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018017 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018052 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018063 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d12412cb-bde4-4c84-bd52-42ac9cb6232c","Type":"ContainerDied","Data":"4e7bd3339c4a0a6d3bb2de47fcc59da6af0ee83f739c986543db43b9bac674f4"} Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.018077 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.067919 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.074099 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.092891 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112063 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.112457 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata" Feb 17 13:48:43 crc 
kubenswrapper[4804]: I0217 13:48:43.112473 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata" Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.112519 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112525 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112726 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-log" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.112752 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" containerName="nova-metadata-metadata" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.114055 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.126693 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.129888 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.138418 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.138479 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} err="failed to get container status \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.138510 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" Feb 17 13:48:43 crc kubenswrapper[4804]: E0217 13:48:43.139880 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 
4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.139921 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} err="failed to get container status \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.139951 4804 scope.go:117] "RemoveContainer" containerID="68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140187 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d"} err="failed to get container status \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": rpc error: code = NotFound desc = could not find container \"68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d\": container with ID starting with 68f3226d2e0ed265aeb8694c0d6911c16833cce4583b30d1aa0e4da3be6b440d not found: ID does not exist" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140226 4804 scope.go:117] "RemoveContainer" containerID="4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.140423 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c"} err="failed to get container status 
\"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": rpc error: code = NotFound desc = could not find container \"4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c\": container with ID starting with 4c7bebae4d340d8939f70598915f6081d794081c1be4ac59380d4859e656009c not found: ID does not exist" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.154860 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.182438 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213224 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213248 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.213357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.315737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316029 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316103 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " 
pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.316246 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.317156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.320613 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.321528 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.334090 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.334256 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " pod="openstack/nova-metadata-0" Feb 17 13:48:43 crc kubenswrapper[4804]: I0217 13:48:43.477980 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:44 crc kubenswrapper[4804]: I0217 13:48:44.586662 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12412cb-bde4-4c84-bd52-42ac9cb6232c" path="/var/lib/kubelet/pods/d12412cb-bde4-4c84-bd52-42ac9cb6232c/volumes" Feb 17 13:48:44 crc kubenswrapper[4804]: I0217 13:48:44.672620 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.081684 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.082031 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.128238 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.266854 4804 generic.go:334] "Generic (PLEG): container finished" podID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerID="29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6" exitCode=0 Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.266960 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerDied","Data":"29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6"} Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.271370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerStarted","Data":"981aac18a07ba416bc920d67be4b035029798cd9a4dd9ffa83de28fefc1ded2e"} Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.310676 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.31065631 podStartE2EDuration="2.31065631s" podCreationTimestamp="2026-02-17 13:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:45.307529961 +0000 UTC m=+1399.418949318" watchObservedRunningTime="2026-02-17 13:48:45.31065631 +0000 UTC m=+1399.422075647" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.313175 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.442401 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.442460 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.477257 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 
13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.539349 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.547343 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:48:45 crc kubenswrapper[4804]: I0217 13:48:45.547823 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" containerID="cri-o://be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" gracePeriod=10 Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.097566 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277445 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277591 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277640 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 
13:48:46.277667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277751 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.277781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") pod \"737ac1d8-ad22-4a56-b203-eb2212949fb6\" (UID: \"737ac1d8-ad22-4a56-b203-eb2212949fb6\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285288 4804 generic.go:334] "Generic (PLEG): container finished" podID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" exitCode=0 Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285404 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285416 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"} Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-mtrxj" event={"ID":"737ac1d8-ad22-4a56-b203-eb2212949fb6","Type":"ContainerDied","Data":"2d996d992d2a3254b879bd96b12e636e65525644b7181f7f3f61897c257c69b0"} Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.285521 4804 scope.go:117] "RemoveContainer" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.297556 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf" (OuterVolumeSpecName: "kube-api-access-x9ztf") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "kube-api-access-x9ztf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.355938 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.360637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.365675 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config" (OuterVolumeSpecName: "config") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.379829 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382283 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382318 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382332 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382346 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.382358 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9ztf\" (UniqueName: \"kubernetes.io/projected/737ac1d8-ad22-4a56-b203-eb2212949fb6-kube-api-access-x9ztf\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.410775 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "737ac1d8-ad22-4a56-b203-eb2212949fb6" (UID: "737ac1d8-ad22-4a56-b203-eb2212949fb6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.484390 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737ac1d8-ad22-4a56-b203-eb2212949fb6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.488442 4804 scope.go:117] "RemoveContainer" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.507702 4804 scope.go:117] "RemoveContainer" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" Feb 17 13:48:46 crc kubenswrapper[4804]: E0217 13:48:46.512171 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": container with ID starting with be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7 not found: ID does not exist" containerID="be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.512245 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7"} err="failed to get container status \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": rpc error: code = NotFound desc = could not find container \"be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7\": container with ID starting with be3cfd9284bc924e079e815dce6ec8556a04ca7bbbe9c4fae1b1eb972152c0b7 not found: ID does not exist" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.512276 4804 scope.go:117] "RemoveContainer" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165" Feb 17 13:48:46 crc kubenswrapper[4804]: E0217 13:48:46.515495 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": container with ID starting with 694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165 not found: ID does not exist" containerID="694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.515529 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165"} err="failed to get container status \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": rpc error: code = NotFound desc = could not find container \"694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165\": container with ID starting with 694f46d3a17de98217359c48adef8a435392ec51f1c7919012624d49fd6b0165 not found: ID does not exist" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.524436 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.524436 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.621300 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.627051 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.636655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-mtrxj"] Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688167 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688244 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688273 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.688361 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") pod \"6597adc7-fdae-4de0-99bc-87d9807f38f4\" (UID: \"6597adc7-fdae-4de0-99bc-87d9807f38f4\") " Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.692152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb" (OuterVolumeSpecName: "kube-api-access-j8hvb") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "kube-api-access-j8hvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.692745 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts" (OuterVolumeSpecName: "scripts") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.714759 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data" (OuterVolumeSpecName: "config-data") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.715836 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6597adc7-fdae-4de0-99bc-87d9807f38f4" (UID: "6597adc7-fdae-4de0-99bc-87d9807f38f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790074 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790354 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790439 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6597adc7-fdae-4de0-99bc-87d9807f38f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:46 crc kubenswrapper[4804]: I0217 13:48:46.790514 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8hvb\" (UniqueName: \"kubernetes.io/projected/6597adc7-fdae-4de0-99bc-87d9807f38f4-kube-api-access-j8hvb\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.299524 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmp8r" event={"ID":"6597adc7-fdae-4de0-99bc-87d9807f38f4","Type":"ContainerDied","Data":"b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a"} Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.300856 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b65e726ad0fd58fbc98c718204a9a6619e848272b9dfc249b9a1897ff310c04a" Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.299604 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmp8r" Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405038 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405313 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log" containerID="cri-o://23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" gracePeriod=30 Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.405764 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api" containerID="cri-o://43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" gracePeriod=30 Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.423930 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.424415 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler" containerID="cri-o://def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" gracePeriod=30 Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.441109 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.442320 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log" containerID="cri-o://316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" gracePeriod=30 Feb 17 13:48:47 crc kubenswrapper[4804]: I0217 13:48:47.442360 4804 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" containerID="cri-o://1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" gracePeriod=30 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.062080 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213284 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213376 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213505 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213549 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213573 4804 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") pod \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\" (UID: \"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21\") " Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.213737 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs" (OuterVolumeSpecName: "logs") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.214413 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.220478 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh" (OuterVolumeSpecName: "kube-api-access-fgqgh") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "kube-api-access-fgqgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.241270 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.250601 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data" (OuterVolumeSpecName: "config-data") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.278631 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" (UID: "63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317056 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317108 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqgh\" (UniqueName: \"kubernetes.io/projected/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-kube-api-access-fgqgh\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317127 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.317139 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331187 4804 generic.go:334] "Generic (PLEG): container finished" podID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" exitCode=0 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331236 4804 generic.go:334] "Generic (PLEG): container finished" podID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" exitCode=143 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331274 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331375 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331392 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.331397 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21","Type":"ContainerDied","Data":"981aac18a07ba416bc920d67be4b035029798cd9a4dd9ffa83de28fefc1ded2e"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.333455 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" exitCode=143 Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.333496 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.376905 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.378353 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.392469 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411031 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411655 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411693 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411708 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411717 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata" Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411740 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="init"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411746 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="init"
Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411776 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411783 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log"
Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.411806 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.411813 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412061 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" containerName="dnsmasq-dns"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412096 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-log"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412112 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" containerName="nova-metadata-metadata"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.412123 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" containerName="nova-manage"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.413650 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.417056 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.419242 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.419383 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.425169 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"
Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.426756 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.426788 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} err="failed to get container status \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.426814 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"
Feb 17 13:48:48 crc kubenswrapper[4804]: E0217 13:48:48.430885 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431088 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} err="failed to get container status \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431119 4804 scope.go:117] "RemoveContainer" containerID="1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.431982 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c"} err="failed to get container status \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": rpc error: code = NotFound desc = could not find container \"1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c\": container with ID starting with 1b656928bc960a912a0d365195455c69ef491a1d2cf6faba1db06e6aca0a539c not found: ID does not exist"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.432016 4804 scope.go:117] "RemoveContainer" containerID="316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.432374 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d"} err="failed to get container status \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": rpc error: code = NotFound desc = could not find container \"316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d\": container with ID starting with 316bd5b973dc2ed5f32629f896a12fc942acb61c9a942863b5059f67e620eb3d not found: ID does not exist"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520565 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520620 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520704 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.520773 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.586408 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21" path="/var/lib/kubelet/pods/63eb831f-e10b-4c76-9e6d-3a8b2c5ecb21/volumes"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.587109 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737ac1d8-ad22-4a56-b203-eb2212949fb6" path="/var/lib/kubelet/pods/737ac1d8-ad22-4a56-b203-eb2212949fb6/volumes"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623269 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623392 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623491 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623595 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.623895 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.626854 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.626956 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.627218 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.640920 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"nova-metadata-0\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " pod="openstack/nova-metadata-0"
Feb 17 13:48:48 crc kubenswrapper[4804]: I0217 13:48:48.735776 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.192742 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 13:48:49 crc kubenswrapper[4804]: W0217 13:48:49.194259 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa87191a_671d_43c8_b8c2_e5e07a54af02.slice/crio-7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a WatchSource:0}: Error finding container 7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a: Status 404 returned error can't find the container with id 7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a
Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.343118 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a"}
Feb 17 13:48:49 crc kubenswrapper[4804]: I0217 13:48:49.934338 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") "
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050697 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") "
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.050808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") pod \"43796f1c-9838-40a1-9829-f878c2a7f076\" (UID: \"43796f1c-9838-40a1-9829-f878c2a7f076\") "
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.056804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx" (OuterVolumeSpecName: "kube-api-access-hnbnx") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "kube-api-access-hnbnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.100331 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.100373 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data" (OuterVolumeSpecName: "config-data") pod "43796f1c-9838-40a1-9829-f878c2a7f076" (UID: "43796f1c-9838-40a1-9829-f878c2a7f076"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152851 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152900 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43796f1c-9838-40a1-9829-f878c2a7f076-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.152913 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnbnx\" (UniqueName: \"kubernetes.io/projected/43796f1c-9838-40a1-9829-f878c2a7f076-kube-api-access-hnbnx\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.370893 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"}
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.370963 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerStarted","Data":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"}
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374805 4804 generic.go:334] "Generic (PLEG): container finished" podID="43796f1c-9838-40a1-9829-f878c2a7f076" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f" exitCode=0
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374852 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerDied","Data":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"}
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"43796f1c-9838-40a1-9829-f878c2a7f076","Type":"ContainerDied","Data":"76fb016395e0231c3d8a7ae1865fe7cb74985c931d1b06a2d2e7c2491c7f5dbe"}
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374917 4804 scope.go:117] "RemoveContainer" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.374922 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.378934 4804 generic.go:334] "Generic (PLEG): container finished" podID="c11e165e-2605-470a-a865-230b274ce8d3" containerID="24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137" exitCode=0
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.379125 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerDied","Data":"24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137"}
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.399991 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.399973515 podStartE2EDuration="2.399973515s" podCreationTimestamp="2026-02-17 13:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:50.389470075 +0000 UTC m=+1404.500889432" watchObservedRunningTime="2026-02-17 13:48:50.399973515 +0000 UTC m=+1404.511392852"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.400141 4804 scope.go:117] "RemoveContainer" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"
Feb 17 13:48:50 crc kubenswrapper[4804]: E0217 13:48:50.400522 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": container with ID starting with def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f not found: ID does not exist" containerID="def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.400548 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f"} err="failed to get container status \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": rpc error: code = NotFound desc = could not find container \"def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f\": container with ID starting with def8cc35b8a0527a15ca9c8261b86f48b99c8454d4a98322ec91fe520a6a815f not found: ID does not exist"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.442394 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.452493 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.467691 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:50 crc kubenswrapper[4804]: E0217 13:48:50.468038 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468058 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468279 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" containerName="nova-scheduler-scheduler"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.468929 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.472214 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.481369 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.586461 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43796f1c-9838-40a1-9829-f878c2a7f076" path="/var/lib/kubelet/pods/43796f1c-9838-40a1-9829-f878c2a7f076/volumes"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.661509 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.661819 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.662167 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.764135 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.764984 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.766520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.769070 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.769231 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.783941 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"nova-scheduler-0\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " pod="openstack/nova-scheduler-0"
Feb 17 13:48:50 crc kubenswrapper[4804]: I0217 13:48:50.793060 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.244443 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.389893 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerStarted","Data":"45d059e86e213177abef9a85b8685f82c73749ae5fde7098a9e718ebf9c0ae93"}
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.729060 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj"
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835659 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") "
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835702 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") "
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835769 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") "
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.835847 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") pod \"c11e165e-2605-470a-a865-230b274ce8d3\" (UID: \"c11e165e-2605-470a-a865-230b274ce8d3\") "
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.839967 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4" (OuterVolumeSpecName: "kube-api-access-smxf4") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "kube-api-access-smxf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.840260 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts" (OuterVolumeSpecName: "scripts") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.883421 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.883800 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data" (OuterVolumeSpecName: "config-data") pod "c11e165e-2605-470a-a865-230b274ce8d3" (UID: "c11e165e-2605-470a-a865-230b274ce8d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937743 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937782 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937798 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smxf4\" (UniqueName: \"kubernetes.io/projected/c11e165e-2605-470a-a865-230b274ce8d3-kube-api-access-smxf4\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:51 crc kubenswrapper[4804]: I0217 13:48:51.937811 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11e165e-2605-470a-a865-230b274ce8d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.378179 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402157 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wq5kj" event={"ID":"c11e165e-2605-470a-a865-230b274ce8d3","Type":"ContainerDied","Data":"e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb"}
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402190 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wq5kj"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.402219 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2acfd4d07f2a376a865d19f0462c02776c9702874f6443998a2d4c2b54946eb"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.404176 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerStarted","Data":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"}
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407734 4804 generic.go:334] "Generic (PLEG): container finished" podID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" exitCode=0
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407782 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"}
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407814 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"80e4a011-e72b-4fea-b6cb-15425d5d5940","Type":"ContainerDied","Data":"f4250a31bbaa8fb75b13d25a5ac1d90d53745e9ab78a95f6bcc9ab9e9e16fb06"}
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407811 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.407830 4804 scope.go:117] "RemoveContainer" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.440678 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.440654061 podStartE2EDuration="2.440654061s" podCreationTimestamp="2026-02-17 13:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:52.430697808 +0000 UTC m=+1406.542117155" watchObservedRunningTime="2026-02-17 13:48:52.440654061 +0000 UTC m=+1406.552073398"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.449281 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") "
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.450831 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs" (OuterVolumeSpecName: "logs") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452050 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") "
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452211 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") "
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.452345 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") pod \"80e4a011-e72b-4fea-b6cb-15425d5d5940\" (UID: \"80e4a011-e72b-4fea-b6cb-15425d5d5940\") "
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.453792 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e4a011-e72b-4fea-b6cb-15425d5d5940-logs\") on node \"crc\" DevicePath \"\""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.460446 4804 scope.go:117] "RemoveContainer" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.462073 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq" (OuterVolumeSpecName: "kube-api-access-gptwq") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "kube-api-access-gptwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.509464 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.514059 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data" (OuterVolumeSpecName: "config-data") pod "80e4a011-e72b-4fea-b6cb-15425d5d5940" (UID: "80e4a011-e72b-4fea-b6cb-15425d5d5940"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527335 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527872 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527897 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api"
Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527913 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527920 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log"
Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.527929 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.527936 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528237 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-api"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528260 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" containerName="nova-api-log"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.528272 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11e165e-2605-470a-a865-230b274ce8d3" containerName="nova-cell1-conductor-db-sync"
Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.529026 4804 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.532470 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.536947 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556768 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556821 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e4a011-e72b-4fea-b6cb-15425d5d5940-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.556831 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gptwq\" (UniqueName: \"kubernetes.io/projected/80e4a011-e72b-4fea-b6cb-15425d5d5940-kube-api-access-gptwq\") on node \"crc\" DevicePath \"\"" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587022 4804 scope.go:117] "RemoveContainer" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.587439 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": container with ID starting with 43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01 not found: ID does not exist" containerID="43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587488 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01"} err="failed to get container status \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": rpc error: code = NotFound desc = could not find container \"43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01\": container with ID starting with 43a7ab1b51a9771eea24f4ca58195ac019c4d4f576bf8a9167479dd864686f01 not found: ID does not exist" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587517 4804 scope.go:117] "RemoveContainer" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" Feb 17 13:48:52 crc kubenswrapper[4804]: E0217 13:48:52.587847 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": container with ID starting with 23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f not found: ID does not exist" containerID="23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.587915 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f"} err="failed to get container status \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": rpc error: code = NotFound desc = could not find container \"23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f\": container with ID starting with 23b101efea3a1f5b40b2a37bc5a802b2bc03e09dfe00e3188317dd5155cb9d8f not found: ID does not exist" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.658940 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod 
\"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.660043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.661733 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.744763 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.753566 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.766992 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768837 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" 
(UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.768894 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.769111 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.772959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.774057 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.774898 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13dbc73-75fc-448b-af44-cb7018d1640e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.778866 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.789443 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzvp\" (UniqueName: \"kubernetes.io/projected/a13dbc73-75fc-448b-af44-cb7018d1640e-kube-api-access-nbzvp\") pod \"nova-cell1-conductor-0\" (UID: 
\"a13dbc73-75fc-448b-af44-cb7018d1640e\") " pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870712 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.870821 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.883050 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990168 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990597 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990624 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.990888 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.991659 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.998724 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:52 crc kubenswrapper[4804]: I0217 13:48:52.999010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.021931 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"nova-api-0\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.100147 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.333943 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 13:48:53 crc kubenswrapper[4804]: W0217 13:48:53.339595 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13dbc73_75fc_448b_af44_cb7018d1640e.slice/crio-eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61 WatchSource:0}: Error finding container eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61: Status 404 returned error can't find the container with id eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61 Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.421461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"a13dbc73-75fc-448b-af44-cb7018d1640e","Type":"ContainerStarted","Data":"eddd1c07b9438ffa6bba5a0556b4300201c1c61622174476a6397c2ad1987c61"} Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.547173 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:48:53 crc kubenswrapper[4804]: W0217 13:48:53.551282 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d46aa4d_a4d9_4376_8c8f_2dee489f4662.slice/crio-d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667 WatchSource:0}: Error finding container d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667: Status 404 returned error can't find the container with id d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667 Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.737612 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:48:53 crc kubenswrapper[4804]: I0217 13:48:53.737679 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433974 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.433987 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerStarted","Data":"d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.438456 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a13dbc73-75fc-448b-af44-cb7018d1640e","Type":"ContainerStarted","Data":"839843a1103d6f617dc26c0fc61f8789035e22903b26a0675932659651d3a249"} Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.438600 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.461417 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4613965110000002 podStartE2EDuration="2.461396511s" podCreationTimestamp="2026-02-17 13:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:54.456085853 +0000 UTC m=+1408.567505200" watchObservedRunningTime="2026-02-17 13:48:54.461396511 +0000 UTC m=+1408.572815858" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.480815 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.48079674 podStartE2EDuration="2.48079674s" podCreationTimestamp="2026-02-17 13:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:48:54.474241654 +0000 UTC m=+1408.585660991" watchObservedRunningTime="2026-02-17 13:48:54.48079674 +0000 UTC m=+1408.592216077" Feb 17 13:48:54 crc kubenswrapper[4804]: I0217 13:48:54.587186 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e4a011-e72b-4fea-b6cb-15425d5d5940" 
path="/var/lib/kubelet/pods/80e4a011-e72b-4fea-b6cb-15425d5d5940/volumes" Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.793654 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.835468 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:48:55 crc kubenswrapper[4804]: I0217 13:48:55.835537 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:48:58 crc kubenswrapper[4804]: I0217 13:48:58.736398 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:48:58 crc kubenswrapper[4804]: I0217 13:48:58.736678 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:48:59 crc kubenswrapper[4804]: I0217 13:48:59.749384 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:48:59 crc kubenswrapper[4804]: I0217 13:48:59.749442 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.377907 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.794471 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 13:49:00 crc kubenswrapper[4804]: I0217 13:49:00.827039 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 13:49:01 crc kubenswrapper[4804]: I0217 13:49:01.562080 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 13:49:02 crc kubenswrapper[4804]: I0217 13:49:02.918929 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.100994 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.101308 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.865867 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:03 crc kubenswrapper[4804]: I0217 13:49:03.866088 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" containerID="cri-o://5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" gracePeriod=30 Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.186812 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.187348 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.387267 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.506403 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") pod \"cae6d84c-f65f-4ab2-a733-424ea34c680d\" (UID: \"cae6d84c-f65f-4ab2-a733-424ea34c680d\") " Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.512933 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr" (OuterVolumeSpecName: "kube-api-access-ncwbr") pod "cae6d84c-f65f-4ab2-a733-424ea34c680d" (UID: "cae6d84c-f65f-4ab2-a733-424ea34c680d"). InnerVolumeSpecName "kube-api-access-ncwbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596110 4804 generic.go:334] "Generic (PLEG): container finished" podID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" exitCode=2 Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596161 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerDied","Data":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"} Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cae6d84c-f65f-4ab2-a733-424ea34c680d","Type":"ContainerDied","Data":"6af0a26e9132d4c61e6cb494719994825c6ff8368e85c8ef8c51fa4c2767ffd0"} Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596224 4804 scope.go:117] "RemoveContainer" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.596342 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.608102 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncwbr\" (UniqueName: \"kubernetes.io/projected/cae6d84c-f65f-4ab2-a733-424ea34c680d-kube-api-access-ncwbr\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.681512 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.696045 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.700081 4804 scope.go:117] "RemoveContainer" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: E0217 13:49:04.709363 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": container with ID starting with 5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0 not found: ID does not exist" containerID="5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.709426 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0"} err="failed to get container status \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": rpc error: code = NotFound desc = could not find container \"5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0\": container with ID starting with 5a806d3dffc413a5deaa4f1d154f348db2fd3905e9efed500b9b4abe1bb8cfa0 not found: ID does not exist" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728272 4804 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: E0217 13:49:04.728764 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728781 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.728993 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" containerName="kube-state-metrics" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.729712 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.748178 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.748338 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.761599 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811327 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmz5\" (UniqueName: \"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811441 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811487 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.811516 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmz5\" (UniqueName: \"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912782 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912814 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.912836 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.918486 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.918588 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.921559 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:04 crc kubenswrapper[4804]: I0217 13:49:04.932999 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmz5\" (UniqueName: 
\"kubernetes.io/projected/d6aabf20-b0bf-4f35-aec7-098f38bacfd9-kube-api-access-jmmz5\") pod \"kube-state-metrics-0\" (UID: \"d6aabf20-b0bf-4f35-aec7-098f38bacfd9\") " pod="openstack/kube-state-metrics-0" Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.091442 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.573848 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 13:49:05 crc kubenswrapper[4804]: W0217 13:49:05.577167 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6aabf20_b0bf_4f35_aec7_098f38bacfd9.slice/crio-8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358 WatchSource:0}: Error finding container 8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358: Status 404 returned error can't find the container with id 8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.605549 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6aabf20-b0bf-4f35-aec7-098f38bacfd9","Type":"ContainerStarted","Data":"8dec676dd6a90d7dfb73a1108ca6ea866b7b2f1642539f4552196380d13d9358"} Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.839995 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840647 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" containerID="cri-o://7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840780 4804 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" containerID="cri-o://739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840869 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" containerID="cri-o://c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" gracePeriod=30 Feb 17 13:49:05 crc kubenswrapper[4804]: I0217 13:49:05.840851 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" containerID="cri-o://626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" gracePeriod=30 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.585796 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae6d84c-f65f-4ab2-a733-424ea34c680d" path="/var/lib/kubelet/pods/cae6d84c-f65f-4ab2-a733-424ea34c680d/volumes" Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.617733 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d6aabf20-b0bf-4f35-aec7-098f38bacfd9","Type":"ContainerStarted","Data":"58b9835c32417f29c65635ceb7ce6b84e66c392dc8d4953534bc30e129934091"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.618619 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621160 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" exitCode=0 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621191 4804 
generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" exitCode=2 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621221 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" exitCode=0 Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621271 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.621285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} Feb 17 13:49:06 crc kubenswrapper[4804]: I0217 13:49:06.643898 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.237444754 podStartE2EDuration="2.643868619s" podCreationTimestamp="2026-02-17 13:49:04 +0000 UTC" firstStartedPulling="2026-02-17 13:49:05.579246504 +0000 UTC m=+1419.690665841" lastFinishedPulling="2026-02-17 13:49:05.985670369 +0000 UTC m=+1420.097089706" observedRunningTime="2026-02-17 13:49:06.633765592 +0000 UTC m=+1420.745184929" watchObservedRunningTime="2026-02-17 13:49:06.643868619 +0000 UTC m=+1420.755287956" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 
13:49:08.742392 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 13:49:08.747055 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:08 crc kubenswrapper[4804]: I0217 13:49:08.752822 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:09 crc kubenswrapper[4804]: I0217 13:49:09.665720 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.094005 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206823 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206880 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206948 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.206988 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207095 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.207189 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") pod \"0e6284b7-c2bf-491d-a8b8-66390efc3657\" (UID: \"0e6284b7-c2bf-491d-a8b8-66390efc3657\") " Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.208681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.208756 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.214681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts" (OuterVolumeSpecName: "scripts") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.234443 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.238371 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6" (OuterVolumeSpecName: "kube-api-access-5l9t6") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "kube-api-access-5l9t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310361 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310388 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310398 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9t6\" (UniqueName: \"kubernetes.io/projected/0e6284b7-c2bf-491d-a8b8-66390efc3657-kube-api-access-5l9t6\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310444 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e6284b7-c2bf-491d-a8b8-66390efc3657-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.310452 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.312497 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.327799 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data" (OuterVolumeSpecName: "config-data") pod "0e6284b7-c2bf-491d-a8b8-66390efc3657" (UID: "0e6284b7-c2bf-491d-a8b8-66390efc3657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.412259 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.412290 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6284b7-c2bf-491d-a8b8-66390efc3657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668811 4804 generic.go:334] "Generic (PLEG): container finished" podID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" exitCode=0 Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668865 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e6284b7-c2bf-491d-a8b8-66390efc3657","Type":"ContainerDied","Data":"2d2e5d5016d0e7547bab751744c5123f906da3f81613bad821f32b24482acee8"} Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668919 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.668933 4804 scope.go:117] "RemoveContainer" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.691165 4804 scope.go:117] "RemoveContainer" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.694179 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.705240 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.713006 4804 scope.go:117] "RemoveContainer" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723125 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723578 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723598 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723610 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723616 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723636 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" 
containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723642 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.723653 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723658 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723837 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-central-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723854 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="sg-core" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723870 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="ceilometer-notification-agent" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.723881 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" containerName="proxy-httpd" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.725550 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732281 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732555 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.732869 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.734878 4804 scope.go:117] "RemoveContainer" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.746339 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822358 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822586 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " 
pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822622 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822640 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822683 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.822997 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.856368 4804 scope.go:117] "RemoveContainer" 
containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.857286 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": container with ID starting with 739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1 not found: ID does not exist" containerID="739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857348 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1"} err="failed to get container status \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": rpc error: code = NotFound desc = could not find container \"739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1\": container with ID starting with 739d82db51200a6aabaa651d7d7ac8ef8cc374993828b81c66981bd68d01b0b1 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857379 4804 scope.go:117] "RemoveContainer" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.857763 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": container with ID starting with 626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10 not found: ID does not exist" containerID="626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857787 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10"} err="failed to get container status \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": rpc error: code = NotFound desc = could not find container \"626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10\": container with ID starting with 626e902362b6353811b54bed35993141950fc6ad10f5cc8f6cea6b4259262e10 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.857804 4804 scope.go:117] "RemoveContainer" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.858063 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": container with ID starting with c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607 not found: ID does not exist" containerID="c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858088 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607"} err="failed to get container status \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": rpc error: code = NotFound desc = could not find container \"c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607\": container with ID starting with c4c58d3303344b37b7dc7ae819f9a51ca95e884e972546633bf78895d550d607 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858105 4804 scope.go:117] "RemoveContainer" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: E0217 13:49:10.858320 4804 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": container with ID starting with 7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37 not found: ID does not exist" containerID="7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.858346 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37"} err="failed to get container status \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": rpc error: code = NotFound desc = could not find container \"7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37\": container with ID starting with 7ec1c31be2fad580ed205d1e9c083f2d309329aa6efc4e37d932dccebcd4de37 not found: ID does not exist" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925160 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925243 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925321 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " 
pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925421 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925454 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925473 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.925508 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.926483 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.926672 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932118 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932271 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.932997 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.937389 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.938184 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:10 crc kubenswrapper[4804]: I0217 13:49:10.943156 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"ceilometer-0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " pod="openstack/ceilometer-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.154785 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.317797 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435423 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: \"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.435710 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") pod \"aeb819ef-7656-4054-baa2-02efb705872d\" (UID: 
\"aeb819ef-7656-4054-baa2-02efb705872d\") " Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.440191 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q" (OuterVolumeSpecName: "kube-api-access-b8n6q") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "kube-api-access-b8n6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.465464 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data" (OuterVolumeSpecName: "config-data") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.465742 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb819ef-7656-4054-baa2-02efb705872d" (UID: "aeb819ef-7656-4054-baa2-02efb705872d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538267 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538311 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8n6q\" (UniqueName: \"kubernetes.io/projected/aeb819ef-7656-4054-baa2-02efb705872d-kube-api-access-b8n6q\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.538323 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb819ef-7656-4054-baa2-02efb705872d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:11 crc kubenswrapper[4804]: W0217 13:49:11.662726 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8813e275_f23d_497a_a085_0ac6e26ab8c0.slice/crio-4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024 WatchSource:0}: Error finding container 4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024: Status 404 returned error can't find the container with id 4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024 Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.664664 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.678543 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680102 4804 generic.go:334] "Generic (PLEG): container finished" 
podID="aeb819ef-7656-4054-baa2-02efb705872d" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" exitCode=137 Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680144 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerDied","Data":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680160 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aeb819ef-7656-4054-baa2-02efb705872d","Type":"ContainerDied","Data":"4ffce2ca7928c17f5cf87a7c53dec619e957c317bf58abe39325d5edeb55c199"} Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680175 4804 scope.go:117] "RemoveContainer" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.680270 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.708295 4804 scope.go:117] "RemoveContainer" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: E0217 13:49:11.709546 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": container with ID starting with 12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87 not found: ID does not exist" containerID="12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.709585 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87"} err="failed to get container status \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": rpc error: code = NotFound desc = could not find container \"12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87\": container with ID starting with 12eb25fbc580c4805b1380337dedc13f277483de04cb357d0874c00b946b6e87 not found: ID does not exist" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.710421 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.718603 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.731714 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: E0217 13:49:11.732163 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" 
Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.732183 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.732416 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb819ef-7656-4054-baa2-02efb705872d" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.733089 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737325 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737431 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.737757 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.744277 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.842819 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.842965 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843312 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.843475 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.944767 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.944954 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945001 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945044 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.945097 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.950784 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.950985 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.951346 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.956982 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c380610-c164-4798-a5df-9b90fd475667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:11 crc kubenswrapper[4804]: I0217 13:49:11.963824 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xsm\" (UniqueName: \"kubernetes.io/projected/5c380610-c164-4798-a5df-9b90fd475667-kube-api-access-m6xsm\") pod \"nova-cell1-novncproxy-0\" (UID: \"5c380610-c164-4798-a5df-9b90fd475667\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.057992 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.512016 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.592574 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6284b7-c2bf-491d-a8b8-66390efc3657" path="/var/lib/kubelet/pods/0e6284b7-c2bf-491d-a8b8-66390efc3657/volumes" Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.593589 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb819ef-7656-4054-baa2-02efb705872d" path="/var/lib/kubelet/pods/aeb819ef-7656-4054-baa2-02efb705872d/volumes" Feb 17 13:49:12 crc kubenswrapper[4804]: I0217 13:49:12.691277 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c380610-c164-4798-a5df-9b90fd475667","Type":"ContainerStarted","Data":"d69c90225ba282cd52c0e4053112dc70f43cb765cbf91891dfd4fd705ac37225"} Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.104264 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.105706 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.108921 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.113911 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.702156 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5c380610-c164-4798-a5df-9b90fd475667","Type":"ContainerStarted","Data":"3e3b56f303907257935d1c0c65df81e464e3beecf73066a6cd9b9dee8ec04501"} Feb 
17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.706007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.706076 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.712899 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.724005 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.723980262 podStartE2EDuration="2.723980262s" podCreationTimestamp="2026-02-17 13:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:13.721143893 +0000 UTC m=+1427.832563230" watchObservedRunningTime="2026-02-17 13:49:13.723980262 +0000 UTC m=+1427.835399599" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.904703 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.906375 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:13 crc kubenswrapper[4804]: I0217 13:49:13.938400 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092472 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092906 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.092999 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.093809 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.093936 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.195407 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.195699 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.196238 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.196603 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.197023 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.197506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.198329 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199365 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199510 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.199654 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.200455 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.227536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"dnsmasq-dns-cd5cbd7b9-6rhsw\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.232619 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:14 crc kubenswrapper[4804]: I0217 13:49:14.789120 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.103111 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.724916 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" exitCode=0 Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.724986 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba"} Feb 17 13:49:15 crc kubenswrapper[4804]: I0217 13:49:15.725021 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerStarted","Data":"a2838e3552cf9ee264c86b1e5acbfc8482d43bbc95f8a3776ff5253f31fed64a"} Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.450304 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734071 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerStarted","Data":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"} Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734245 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734374 4804 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" containerID="cri-o://e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" gracePeriod=30 Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.734433 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" containerID="cri-o://1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" gracePeriod=30 Feb 17 13:49:16 crc kubenswrapper[4804]: I0217 13:49:16.759319 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" podStartSLOduration=3.759302102 podStartE2EDuration="3.759302102s" podCreationTimestamp="2026-02-17 13:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:16.758420504 +0000 UTC m=+1430.869839851" watchObservedRunningTime="2026-02-17 13:49:16.759302102 +0000 UTC m=+1430.870721439" Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.058240 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.364768 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.748871 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.750365 4804 generic.go:334] "Generic (PLEG): container finished" podID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" 
containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" exitCode=143 Feb 17 13:49:17 crc kubenswrapper[4804]: I0217 13:49:17.750534 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"} Feb 17 13:49:18 crc kubenswrapper[4804]: I0217 13:49:18.765331 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.782289 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerStarted","Data":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.783356 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" containerID="cri-o://cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" gracePeriod=30 Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.783728 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784053 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" containerID="cri-o://c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" gracePeriod=30 Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784283 4804 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" containerID="cri-o://867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" gracePeriod=30 Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.784362 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" containerID="cri-o://b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" gracePeriod=30 Feb 17 13:49:19 crc kubenswrapper[4804]: I0217 13:49:19.815227 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.45422059 podStartE2EDuration="9.8152111s" podCreationTimestamp="2026-02-17 13:49:10 +0000 UTC" firstStartedPulling="2026-02-17 13:49:11.664670381 +0000 UTC m=+1425.776089718" lastFinishedPulling="2026-02-17 13:49:19.025660891 +0000 UTC m=+1433.137080228" observedRunningTime="2026-02-17 13:49:19.814067874 +0000 UTC m=+1433.925487211" watchObservedRunningTime="2026-02-17 13:49:19.8152111 +0000 UTC m=+1433.926630437" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.321191 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413705 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413762 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413815 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.413895 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") pod \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\" (UID: \"2d46aa4d-a4d9-4376-8c8f-2dee489f4662\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.414746 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs" (OuterVolumeSpecName: "logs") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.434246 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw" (OuterVolumeSpecName: "kube-api-access-nmcgw") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "kube-api-access-nmcgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.443486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data" (OuterVolumeSpecName: "config-data") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.445069 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d46aa4d-a4d9-4376-8c8f-2dee489f4662" (UID: "2d46aa4d-a4d9-4376-8c8f-2dee489f4662"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.489272 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516401 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516440 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmcgw\" (UniqueName: \"kubernetes.io/projected/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-kube-api-access-nmcgw\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516454 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.516467 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d46aa4d-a4d9-4376-8c8f-2dee489f4662-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617607 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617679 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617724 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617799 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.617863 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618329 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618673 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618752 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") pod \"8813e275-f23d-497a-a085-0ac6e26ab8c0\" (UID: \"8813e275-f23d-497a-a085-0ac6e26ab8c0\") " Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.618974 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.619609 4804 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.619625 4804 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8813e275-f23d-497a-a085-0ac6e26ab8c0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.632070 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts" (OuterVolumeSpecName: "scripts") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.643008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6" (OuterVolumeSpecName: "kube-api-access-24pj6") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "kube-api-access-24pj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.656856 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.680789 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.701874 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721111 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721149 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721164 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24pj6\" (UniqueName: \"kubernetes.io/projected/8813e275-f23d-497a-a085-0ac6e26ab8c0-kube-api-access-24pj6\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721178 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.721187 4804 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.725163 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data" (OuterVolumeSpecName: "config-data") pod "8813e275-f23d-497a-a085-0ac6e26ab8c0" (UID: "8813e275-f23d-497a-a085-0ac6e26ab8c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793916 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" exitCode=0 Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793951 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" exitCode=2 Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793959 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" exitCode=0 Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793969 4804 generic.go:334] "Generic (PLEG): container finished" podID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" exitCode=0 Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.793980 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794034 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794092 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794112 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794127 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794140 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8813e275-f23d-497a-a085-0ac6e26ab8c0","Type":"ContainerDied","Data":"4f38f1401a13eb1042fdd0203f8e9dd2f66fd959325271d3b392c37b7cce4024"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.794160 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796441 4804 generic.go:334] "Generic (PLEG): container finished" podID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" exitCode=0 Feb 17 13:49:20 crc 
kubenswrapper[4804]: I0217 13:49:20.796479 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796509 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d46aa4d-a4d9-4376-8c8f-2dee489f4662","Type":"ContainerDied","Data":"d1232b5d92c13480d32625bfcaa956d7a1646000084566fa3ede73979577f667"} Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.796574 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.821226 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.822400 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8813e275-f23d-497a-a085-0ac6e26ab8c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.829780 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.842374 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.850940 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.855233 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886041 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 
13:49:20.886545 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886569 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886584 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886593 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886606 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886625 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886633 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886670 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886677 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.886691 4804 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886698 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886908 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="sg-core" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886925 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-log" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886947 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-notification-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886966 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="ceilometer-central-agent" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886981 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" containerName="nova-api-api" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.886994 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" containerName="proxy-httpd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.888188 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.890040 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.890342 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.891016 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.907419 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.913835 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.923807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.932489 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.934946 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.935103 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.936950 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.936984 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937006 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937898 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.937927 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939018 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939053 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939076 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939345 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939375 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID 
starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939391 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939419 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.939707 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939798 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.939826 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940127 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status 
\"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940155 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940439 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940463 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940785 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.940809 4804 scope.go:117] "RemoveContainer" 
containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.943645 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.949874 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.949930 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950274 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950316 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950609 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950637 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950881 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.950914 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951125 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not 
exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951150 4804 scope.go:117] "RemoveContainer" containerID="c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951443 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac"} err="failed to get container status \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": rpc error: code = NotFound desc = could not find container \"c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac\": container with ID starting with c691b2d7a06e6bbbd752dcc115c61a85165b9f16e18320a85fb7197b0ed096ac not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951501 4804 scope.go:117] "RemoveContainer" containerID="b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951749 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec"} err="failed to get container status \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": rpc error: code = NotFound desc = could not find container \"b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec\": container with ID starting with b81a5c1f65e31455bfc74b8b17ccd6d26e26b290a723b754900c2fb9634154ec not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951773 4804 scope.go:117] "RemoveContainer" containerID="867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951941 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363"} err="failed to get container status 
\"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": rpc error: code = NotFound desc = could not find container \"867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363\": container with ID starting with 867624c7f4f676f2c611b23218c2e4bd91a1433f96704f184a5d1a41efab2363 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.951964 4804 scope.go:117] "RemoveContainer" containerID="cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.952143 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855"} err="failed to get container status \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": rpc error: code = NotFound desc = could not find container \"cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855\": container with ID starting with cf51c95dfd48c241488ac58a3cd79cec7c1c5a16f42ea4e0a8ea8593d6398855 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.952162 4804 scope.go:117] "RemoveContainer" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.975081 4804 scope.go:117] "RemoveContainer" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.990988 4804 scope.go:117] "RemoveContainer" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.991567 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": container with ID starting with 1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0 not 
found: ID does not exist" containerID="1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.991632 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0"} err="failed to get container status \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": rpc error: code = NotFound desc = could not find container \"1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0\": container with ID starting with 1eb83a06c1d6b3047f8b48dc4249bd9299bd8e7635ba1278b431336f1af1d8e0 not found: ID does not exist" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.991660 4804 scope.go:117] "RemoveContainer" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: E0217 13:49:20.992041 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": container with ID starting with e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd not found: ID does not exist" containerID="e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd" Feb 17 13:49:20 crc kubenswrapper[4804]: I0217 13:49:20.992073 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd"} err="failed to get container status \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": rpc error: code = NotFound desc = could not find container \"e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd\": container with ID starting with e7380779b9b488b67f4196e0c790ee76a2f4da6daa1378da1dcee7f63e530bfd not found: ID does not exist" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027295 
4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027382 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027549 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mzz\" 
(UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027603 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027665 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027719 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027763 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: 
\"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.027987 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.028052 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.028066 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129795 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129867 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129901 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129932 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.129983 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130008 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130034 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130064 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mzz\" (UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130255 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130290 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130343 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130382 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130566 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-run-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.130658 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39bfc426-b9af-40b4-a713-26bb2366db7a-log-httpd\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.131020 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-scripts\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134583 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.134584 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-config-data\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.135969 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.136172 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.136616 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.144379 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39bfc426-b9af-40b4-a713-26bb2366db7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.148162 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 
13:49:21.149314 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.152650 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mzz\" (UniqueName: \"kubernetes.io/projected/39bfc426-b9af-40b4-a713-26bb2366db7a-kube-api-access-l7mzz\") pod \"ceilometer-0\" (UID: \"39bfc426-b9af-40b4-a713-26bb2366db7a\") " pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.168776 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"nova-api-0\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.212816 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.266844 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.758440 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.820492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"5ed23a96a0046c231b096d9ec7822cad0493113fc209be6092efb93ac3aeb1f1"} Feb 17 13:49:21 crc kubenswrapper[4804]: I0217 13:49:21.835086 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.059167 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.089152 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.583567 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d46aa4d-a4d9-4376-8c8f-2dee489f4662" path="/var/lib/kubelet/pods/2d46aa4d-a4d9-4376-8c8f-2dee489f4662/volumes" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.584702 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8813e275-f23d-497a-a085-0ac6e26ab8c0" path="/var/lib/kubelet/pods/8813e275-f23d-497a-a085-0ac6e26ab8c0/volumes" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.838064 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.838100 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerStarted","Data":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.840041 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"9690be6c1002861bf0de390b3bd8f555e7e19daaea327060a786b5824e8f9b73"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.840078 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"e45c40f839a3f2cb44fd1aa6071f5e73e3e01da151a9e2b51aad903b7284b659"} Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.858256 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8582378029999997 podStartE2EDuration="2.858237803s" podCreationTimestamp="2026-02-17 13:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:22.85560921 +0000 UTC m=+1436.967028567" watchObservedRunningTime="2026-02-17 13:49:22.858237803 +0000 UTC m=+1436.969657140" Feb 17 13:49:22 crc kubenswrapper[4804]: I0217 13:49:22.860699 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.007443 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.009246 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.011442 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.011601 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.014286 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189605 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189650 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.189684 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: 
\"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291691 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291828 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.291917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.295444 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.296928 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.297385 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.313980 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"nova-cell1-cell-mapping-s8qtz\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.327406 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:23 crc kubenswrapper[4804]: W0217 13:49:23.757328 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6d06cb_8252_4c27_815b_1f09a217cbb4.slice/crio-9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e WatchSource:0}: Error finding container 9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e: Status 404 returned error can't find the container with id 9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.763640 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.850394 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerStarted","Data":"9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e"} Feb 17 13:49:23 crc kubenswrapper[4804]: I0217 13:49:23.853602 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"b6ecb0fb21c7e0514abe1a06cb934090aae05aa720515eef411bf85a5f5bd522"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.235311 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.309255 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.309558 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" 
containerID="cri-o://80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" gracePeriod=10 Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.831241 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.868171 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerStarted","Data":"3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.873053 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"142a6be816a205390c7d052cb7f3cc8b7e1d745dd99eb99bbf503bec3bc6c60f"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883105 4804 generic.go:334] "Generic (PLEG): container finished" podID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" exitCode=0 Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883147 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883172 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" event={"ID":"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70","Type":"ContainerDied","Data":"9ac5534fef55ed02d86af4d8912cb72f23f77c2e384ce39f866abb0e39f803e5"} Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.883187 4804 scope.go:117] "RemoveContainer" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc 
kubenswrapper[4804]: I0217 13:49:24.883686 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nm74r" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.890904 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-s8qtz" podStartSLOduration=2.890888857 podStartE2EDuration="2.890888857s" podCreationTimestamp="2026-02-17 13:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:24.885327012 +0000 UTC m=+1438.996746349" watchObservedRunningTime="2026-02-17 13:49:24.890888857 +0000 UTC m=+1439.002308194" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.911639 4804 scope.go:117] "RemoveContainer" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.940722 4804 scope.go:117] "RemoveContainer" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc kubenswrapper[4804]: E0217 13:49:24.942506 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": container with ID starting with 80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2 not found: ID does not exist" containerID="80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942545 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2"} err="failed to get container status \"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": rpc error: code = NotFound desc = could not find container 
\"80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2\": container with ID starting with 80c53b35d48d085e0876f20fe6ad287ecfaf18dbb36d6aa286cf573185b619c2 not found: ID does not exist" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942567 4804 scope.go:117] "RemoveContainer" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: E0217 13:49:24.942913 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": container with ID starting with 11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654 not found: ID does not exist" containerID="11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654" Feb 17 13:49:24 crc kubenswrapper[4804]: I0217 13:49:24.942951 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654"} err="failed to get container status \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": rpc error: code = NotFound desc = could not find container \"11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654\": container with ID starting with 11ccb1e04351ba07a549d61417358dec2218d53f646147b72efa0ee2b3fc4654 not found: ID does not exist" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.028589 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029272 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029363 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029409 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029486 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.029540 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") pod \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\" (UID: \"59bcfea7-54c7-4d99-afa8-e48fd8d7ee70\") " Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.066211 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x" (OuterVolumeSpecName: "kube-api-access-dqn6x") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). 
InnerVolumeSpecName "kube-api-access-dqn6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.084011 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config" (OuterVolumeSpecName: "config") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.101093 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.110501 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.119254 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.123903 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" (UID: "59bcfea7-54c7-4d99-afa8-e48fd8d7ee70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158521 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158567 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158582 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158592 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqn6x\" (UniqueName: \"kubernetes.io/projected/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-kube-api-access-dqn6x\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158603 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.158612 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.218733 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.226976 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nm74r"] Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.835352 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:49:25 crc kubenswrapper[4804]: I0217 13:49:25.835742 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.586130 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" path="/var/lib/kubelet/pods/59bcfea7-54c7-4d99-afa8-e48fd8d7ee70/volumes" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.912409 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39bfc426-b9af-40b4-a713-26bb2366db7a","Type":"ContainerStarted","Data":"bae793acfed60a7d791f396224f96d981c0c0d8afab0f42ff46c61d5cf3045c4"} Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.912808 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 13:49:26 crc kubenswrapper[4804]: I0217 13:49:26.935142 4804 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.783351529 podStartE2EDuration="6.935124804s" podCreationTimestamp="2026-02-17 13:49:20 +0000 UTC" firstStartedPulling="2026-02-17 13:49:21.831899462 +0000 UTC m=+1435.943318809" lastFinishedPulling="2026-02-17 13:49:25.983672747 +0000 UTC m=+1440.095092084" observedRunningTime="2026-02-17 13:49:26.933111782 +0000 UTC m=+1441.044531129" watchObservedRunningTime="2026-02-17 13:49:26.935124804 +0000 UTC m=+1441.046544141" Feb 17 13:49:28 crc kubenswrapper[4804]: I0217 13:49:28.937109 4804 generic.go:334] "Generic (PLEG): container finished" podID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerID="3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529" exitCode=0 Feb 17 13:49:28 crc kubenswrapper[4804]: I0217 13:49:28.937220 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerDied","Data":"3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529"} Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.279561 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349803 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349895 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.349942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.350013 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") pod \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\" (UID: \"0b6d06cb-8252-4c27-815b-1f09a217cbb4\") " Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.355322 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8" (OuterVolumeSpecName: "kube-api-access-8f5d8") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "kube-api-access-8f5d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.359890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts" (OuterVolumeSpecName: "scripts") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.378537 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.380698 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data" (OuterVolumeSpecName: "config-data") pod "0b6d06cb-8252-4c27-815b-1f09a217cbb4" (UID: "0b6d06cb-8252-4c27-815b-1f09a217cbb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452366 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452491 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f5d8\" (UniqueName: \"kubernetes.io/projected/0b6d06cb-8252-4c27-815b-1f09a217cbb4-kube-api-access-8f5d8\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452509 4804 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.452520 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b6d06cb-8252-4c27-815b-1f09a217cbb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961462 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s8qtz" event={"ID":"0b6d06cb-8252-4c27-815b-1f09a217cbb4","Type":"ContainerDied","Data":"9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e"} Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961524 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e18f0732edbd50ab3a8f7a1d0aecc6c93c903f530d5e5741b73a3b028b8726e" Feb 17 13:49:30 crc kubenswrapper[4804]: I0217 13:49:30.961581 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s8qtz" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.153327 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.154246 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" containerID="cri-o://93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.154319 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" containerID="cri-o://db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.165412 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.165659 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" containerID="cri-o://74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200125 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200416 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" containerID="cri-o://10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.200491 4804 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" containerID="cri-o://f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" gracePeriod=30 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.741455 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779111 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779191 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779268 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779354 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779392 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.779429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") pod \"3be0a823-7437-40f0-977e-0ceab74013ea\" (UID: \"3be0a823-7437-40f0-977e-0ceab74013ea\") " Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.794247 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs" (OuterVolumeSpecName: "logs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.794526 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479" (OuterVolumeSpecName: "kube-api-access-m2479") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "kube-api-access-m2479". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.824713 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.826958 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data" (OuterVolumeSpecName: "config-data") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.841476 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.859506 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3be0a823-7437-40f0-977e-0ceab74013ea" (UID: "3be0a823-7437-40f0-977e-0ceab74013ea"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881770 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2479\" (UniqueName: \"kubernetes.io/projected/3be0a823-7437-40f0-977e-0ceab74013ea-kube-api-access-m2479\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881810 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881824 4804 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881839 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be0a823-7437-40f0-977e-0ceab74013ea-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881851 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.881861 4804 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be0a823-7437-40f0-977e-0ceab74013ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.971999 4804 generic.go:334] "Generic (PLEG): container finished" podID="3be0a823-7437-40f0-977e-0ceab74013ea" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" exitCode=0 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972030 4804 generic.go:334] 
"Generic (PLEG): container finished" podID="3be0a823-7437-40f0-977e-0ceab74013ea" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" exitCode=143 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972082 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972115 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972131 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3be0a823-7437-40f0-977e-0ceab74013ea","Type":"ContainerDied","Data":"5ed23a96a0046c231b096d9ec7822cad0493113fc209be6092efb93ac3aeb1f1"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972143 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.972534 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.978260 4804 generic.go:334] "Generic (PLEG): container finished" podID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" exitCode=143 Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.978298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"} Feb 17 13:49:31 crc kubenswrapper[4804]: I0217 13:49:31.993262 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016544 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.016835 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not found: ID does not exist" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016868 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} err="failed to get container status \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not 
found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.016888 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.017071 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017094 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} err="failed to get container status \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017106 4804 scope.go:117] "RemoveContainer" containerID="db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017291 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab"} err="failed to get container status \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": rpc error: code = NotFound desc = could not find container \"db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab\": container with ID starting with 
db23519ab417745161f2a8210663f78115db406134b1d292d577a60f6b661cab not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017305 4804 scope.go:117] "RemoveContainer" containerID="93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.017466 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d"} err="failed to get container status \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": rpc error: code = NotFound desc = could not find container \"93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d\": container with ID starting with 93ebca2ea5d281cf42180c5e42f24497d52ebdef206d3b1e5f1007c415bb306d not found: ID does not exist" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.034653 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.052977 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066006 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066568 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066583 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="init" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066592 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="init" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066607 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066615 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066647 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066655 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: E0217 13:49:32.066664 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066670 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066890 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-api" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066912 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" containerName="nova-manage" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066929 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bcfea7-54c7-4d99-afa8-e48fd8d7ee70" containerName="dnsmasq-dns" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.066937 4804 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3be0a823-7437-40f0-977e-0ceab74013ea" containerName="nova-api-log" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.067906 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.070778 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071463 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071662 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.071866 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.085963 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " 
pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086005 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086046 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.086104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187566 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187643 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187680 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187716 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187737 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.187755 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.188536 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29528202-42d5-4bcd-90e8-335435ba59cf-logs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.191572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192178 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-config-data\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192756 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.192887 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29528202-42d5-4bcd-90e8-335435ba59cf-public-tls-certs\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.207160 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htr8\" (UniqueName: \"kubernetes.io/projected/29528202-42d5-4bcd-90e8-335435ba59cf-kube-api-access-6htr8\") pod \"nova-api-0\" (UID: \"29528202-42d5-4bcd-90e8-335435ba59cf\") " pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.394184 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.588568 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be0a823-7437-40f0-977e-0ceab74013ea" path="/var/lib/kubelet/pods/3be0a823-7437-40f0-977e-0ceab74013ea/volumes" Feb 17 13:49:32 crc kubenswrapper[4804]: W0217 13:49:32.882779 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29528202_42d5_4bcd_90e8_335435ba59cf.slice/crio-7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9 WatchSource:0}: Error finding container 7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9: Status 404 returned error can't find the container with id 7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9 Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.888005 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 13:49:32 crc kubenswrapper[4804]: I0217 13:49:32.990420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"7c3ef51a6bde88531304f52bedab6600736ab2313ce3ef16aeb7d0cee489a0b9"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.001477 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"d31055369e630c0356125fed33078d98b08a03ad8963cdb41e62d2a70ef41392"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.001788 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29528202-42d5-4bcd-90e8-335435ba59cf","Type":"ContainerStarted","Data":"ba5b77b47bab0c422fcd3b22593d9afb83fc8548133092014d8241ba7f652aae"} Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.031701 4804 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.031677995 podStartE2EDuration="2.031677995s" podCreationTimestamp="2026-02-17 13:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:34.019483231 +0000 UTC m=+1448.130902568" watchObservedRunningTime="2026-02-17 13:49:34.031677995 +0000 UTC m=+1448.143097332" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.353833 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:48052->10.217.0.197:8775: read: connection reset by peer" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.353834 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:48044->10.217.0.197:8775: read: connection reset by peer" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.835029 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936682 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936753 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936796 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.936853 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") pod \"aa87191a-671d-43c8-b8c2-e5e07a54af02\" (UID: \"aa87191a-671d-43c8-b8c2-e5e07a54af02\") " Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.937401 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs" (OuterVolumeSpecName: "logs") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.946149 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt" (OuterVolumeSpecName: "kube-api-access-rqplt") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "kube-api-access-rqplt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.968623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data" (OuterVolumeSpecName: "config-data") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.971598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.980365 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:34 crc kubenswrapper[4804]: I0217 13:49:34.994847 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aa87191a-671d-43c8-b8c2-e5e07a54af02" (UID: "aa87191a-671d-43c8-b8c2-e5e07a54af02"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028302 4804 generic.go:334] "Generic (PLEG): container finished" podID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" exitCode=0 Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028377 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028408 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aa87191a-671d-43c8-b8c2-e5e07a54af02","Type":"ContainerDied","Data":"7c478cf9f4ed2e396f528fdf22823fcf8ddf5f04f1a3c2774ead4eafb4cdd61a"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028429 4804 scope.go:117] "RemoveContainer" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.028539 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.033451 4804 generic.go:334] "Generic (PLEG): container finished" podID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" exitCode=0 Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034354 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerDied","Data":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.034494 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf6d166-4c3f-4fb3-a3b5-f85d47adf823","Type":"ContainerDied","Data":"45d059e86e213177abef9a85b8685f82c73749ae5fde7098a9e718ebf9c0ae93"} Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038782 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038813 4804 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa87191a-671d-43c8-b8c2-e5e07a54af02-logs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038825 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038838 4804 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rqplt\" (UniqueName: \"kubernetes.io/projected/aa87191a-671d-43c8-b8c2-e5e07a54af02-kube-api-access-rqplt\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.038851 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa87191a-671d-43c8-b8c2-e5e07a54af02-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.053172 4804 scope.go:117] "RemoveContainer" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.084815 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094092 4804 scope.go:117] "RemoveContainer" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.094559 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": container with ID starting with f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5 not found: ID does not exist" containerID="f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094600 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5"} err="failed to get container status \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": rpc error: code = NotFound desc = could not find container \"f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5\": container with ID starting with f14b58f1aab61a4641abd8e2099263a3e20347428156e830e60d2bb792f1efe5 not found: ID does 
not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.094632 4804 scope.go:117] "RemoveContainer" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.095616 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": container with ID starting with 10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679 not found: ID does not exist" containerID="10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.095649 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679"} err="failed to get container status \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": rpc error: code = NotFound desc = could not find container \"10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679\": container with ID starting with 10192fda379cac904a9add75091edd3fa4788176b8f596974a82c2b3ace68679 not found: ID does not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.095672 4804 scope.go:117] "RemoveContainer" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.099631 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.128997 4804 scope.go:117] "RemoveContainer" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.132536 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": container with ID starting with 74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a not found: ID does not exist" containerID="74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.132578 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a"} err="failed to get container status \"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": rpc error: code = NotFound desc = could not find container \"74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a\": container with ID starting with 74abf0adfaf756459596c8077a84021698f54ece715646599a6f465fcabf500a not found: ID does not exist" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.142889 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.142952 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.143027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") pod \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\" (UID: \"abf6d166-4c3f-4fb3-a3b5-f85d47adf823\") " Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 
13:49:35.154374 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb" (OuterVolumeSpecName: "kube-api-access-f65hb") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "kube-api-access-f65hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.156862 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.157921 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.157966 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.157977 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.157984 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: E0217 13:49:35.158011 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158039 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158789 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" 
containerName="nova-metadata-metadata" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158825 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" containerName="nova-metadata-log" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.158837 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" containerName="nova-scheduler-scheduler" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.160731 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.162956 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.163009 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.166774 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.170508 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.186152 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data" (OuterVolumeSpecName: "config-data") pod "abf6d166-4c3f-4fb3-a3b5-f85d47adf823" (UID: "abf6d166-4c3f-4fb3-a3b5-f85d47adf823"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245378 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245411 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f65hb\" (UniqueName: \"kubernetes.io/projected/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-kube-api-access-f65hb\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.245421 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf6d166-4c3f-4fb3-a3b5-f85d47adf823-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347425 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347492 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347828 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: 
\"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.347953 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.348113 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.374935 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.397806 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.404445 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.406129 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.409287 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.415858 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.449909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.449965 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450004 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450054 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450077 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.450808 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-logs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.454332 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.454425 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-config-data\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.455938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.465303 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qwz\" (UniqueName: \"kubernetes.io/projected/ee4c15c1-5fb0-4605-9cb8-69a060ec0d39-kube-api-access-95qwz\") pod \"nova-metadata-0\" (UID: \"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39\") " pod="openstack/nova-metadata-0" Feb 17 
13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.478483 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552210 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552287 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.552350 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655389 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655752 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.655834 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.669538 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-config-data\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.671410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bac289d-58a7-4e23-8805-c48811d12d32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.676996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zqm\" (UniqueName: \"kubernetes.io/projected/1bac289d-58a7-4e23-8805-c48811d12d32-kube-api-access-d4zqm\") pod \"nova-scheduler-0\" (UID: \"1bac289d-58a7-4e23-8805-c48811d12d32\") " pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.871845 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 13:49:35 crc kubenswrapper[4804]: I0217 13:49:35.949131 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.055104 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"3cf3bf2971bbe3ac1ef8d149181e32f777d990eb9bd36af317d8f33ac12551c2"} Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.297746 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 13:49:36 crc kubenswrapper[4804]: W0217 13:49:36.300750 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bac289d_58a7_4e23_8805_c48811d12d32.slice/crio-e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb WatchSource:0}: Error finding container e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb: Status 404 returned error can't find the container with id e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.595246 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa87191a-671d-43c8-b8c2-e5e07a54af02" path="/var/lib/kubelet/pods/aa87191a-671d-43c8-b8c2-e5e07a54af02/volumes" Feb 17 13:49:36 crc kubenswrapper[4804]: I0217 13:49:36.596886 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf6d166-4c3f-4fb3-a3b5-f85d47adf823" path="/var/lib/kubelet/pods/abf6d166-4c3f-4fb3-a3b5-f85d47adf823/volumes" Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.069254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"a4494d51ea372199bc15d386c17bb86f13e7d289216155b8a43f96e65e292a84"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.069299 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ee4c15c1-5fb0-4605-9cb8-69a060ec0d39","Type":"ContainerStarted","Data":"eb07a7999853b3bec5d813d4c50bac369e3d9e4b6610762803ac28d7fb7b54bf"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.072558 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bac289d-58a7-4e23-8805-c48811d12d32","Type":"ContainerStarted","Data":"dbf6c34cab2daa4c39572582f1eca7e5d0a1054b839d806faab4284e194858f0"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.072582 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1bac289d-58a7-4e23-8805-c48811d12d32","Type":"ContainerStarted","Data":"e2772258ef70f8993eed183548129d8e4a515ce477a60fa2023f890b7216f2fb"} Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.099272 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.099254039 podStartE2EDuration="2.099254039s" podCreationTimestamp="2026-02-17 13:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:49:37.091249708 +0000 UTC m=+1451.202669045" watchObservedRunningTime="2026-02-17 13:49:37.099254039 +0000 UTC m=+1451.210673376" Feb 17 13:49:37 crc kubenswrapper[4804]: I0217 13:49:37.123373 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.123346497 podStartE2EDuration="2.123346497s" podCreationTimestamp="2026-02-17 13:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-17 13:49:37.114249031 +0000 UTC m=+1451.225668408" watchObservedRunningTime="2026-02-17 13:49:37.123346497 +0000 UTC m=+1451.234765874" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.479705 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.480590 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 13:49:40 crc kubenswrapper[4804]: I0217 13:49:40.872867 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 13:49:42 crc kubenswrapper[4804]: I0217 13:49:42.395173 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:42 crc kubenswrapper[4804]: I0217 13:49:42.395284 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 13:49:43 crc kubenswrapper[4804]: I0217 13:49:43.410526 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29528202-42d5-4bcd-90e8-335435ba59cf" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:43 crc kubenswrapper[4804]: I0217 13:49:43.410637 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29528202-42d5-4bcd-90e8-335435ba59cf" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.478990 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.480740 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.872754 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 13:49:45 crc kubenswrapper[4804]: I0217 13:49:45.917965 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.214471 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.495474 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee4c15c1-5fb0-4605-9cb8-69a060ec0d39" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:46 crc kubenswrapper[4804]: I0217 13:49:46.495473 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ee4c15c1-5fb0-4605-9cb8-69a060ec0d39" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 13:49:51 crc kubenswrapper[4804]: I0217 13:49:51.283779 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.428292 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.428801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.433910 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 
13:49:52 crc kubenswrapper[4804]: I0217 13:49:52.436062 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 13:49:53 crc kubenswrapper[4804]: I0217 13:49:53.235106 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 13:49:53 crc kubenswrapper[4804]: I0217 13:49:53.241247 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.486534 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.486950 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.505926 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.512961 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.835918 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.836021 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.836100 4804 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.837169 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:49:55 crc kubenswrapper[4804]: I0217 13:49:55.837312 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6" gracePeriod=600 Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.271834 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6" exitCode=0 Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.271913 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6"} Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.272510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"} Feb 17 13:49:56 crc kubenswrapper[4804]: I0217 13:49:56.272557 4804 scope.go:117] "RemoveContainer" 
containerID="ea496523757895e32a1aaa278701aae787ee0314e5bf63e36e8c688fc2dbc0d7" Feb 17 13:50:05 crc kubenswrapper[4804]: I0217 13:50:05.796772 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:07 crc kubenswrapper[4804]: I0217 13:50:07.317788 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:09 crc kubenswrapper[4804]: I0217 13:50:09.892044 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" containerID="cri-o://d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" gracePeriod=604796 Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.203013 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" containerID="cri-o://e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" gracePeriod=604797 Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.499160 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Feb 17 13:50:11 crc kubenswrapper[4804]: I0217 13:50:11.626389 4804 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.476950 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.493915 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.493988 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494135 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494173 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494296 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494378 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494465 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494526 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.494646 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") pod \"7705a06d-bc27-4686-9ca4-4aae248ead07\" (UID: \"7705a06d-bc27-4686-9ca4-4aae248ead07\") " Feb 17 13:50:16 crc 
kubenswrapper[4804]: I0217 13:50:16.494741 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.495213 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.495733 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.502850 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info" (OuterVolumeSpecName: "pod-info") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.502890 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.503120 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7" (OuterVolumeSpecName: "kube-api-access-qrqs7") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "kube-api-access-qrqs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.503875 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.512608 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.576897 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data" (OuterVolumeSpecName: "config-data") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.577861 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf" (OuterVolumeSpecName: "server-conf") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599271 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599310 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599322 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599331 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqs7\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-kube-api-access-qrqs7\") on node \"crc\" DevicePath 
\"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599339 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.599348 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603046 4804 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603062 4804 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7705a06d-bc27-4686-9ca4-4aae248ead07-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603101 4804 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7705a06d-bc27-4686-9ca4-4aae248ead07-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.603113 4804 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7705a06d-bc27-4686-9ca4-4aae248ead07-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614045 4804 generic.go:334] "Generic (PLEG): container finished" podID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" exitCode=0 Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614089 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"} Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614095 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614117 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7705a06d-bc27-4686-9ca4-4aae248ead07","Type":"ContainerDied","Data":"1805a02bed1d8e8fe42a7072ff53aa627c043f3fc1570707e67a0dbc0d5ed7c3"} Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.614136 4804 scope.go:117] "RemoveContainer" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.639778 4804 scope.go:117] "RemoveContainer" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.660118 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.674240 4804 scope.go:117] "RemoveContainer" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.684335 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": container with ID starting with d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3 not found: ID does not exist" containerID="d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684377 4804 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3"} err="failed to get container status \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": rpc error: code = NotFound desc = could not find container \"d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3\": container with ID starting with d5fe6af3fa561d599b0a9547e4be296807d6c08d75785f8fe237167c1e1109a3 not found: ID does not exist" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684400 4804 scope.go:117] "RemoveContainer" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.684818 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": container with ID starting with 762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993 not found: ID does not exist" containerID="762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.684836 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993"} err="failed to get container status \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": rpc error: code = NotFound desc = could not find container \"762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993\": container with ID starting with 762e45a264eaebaf3d8f6d5ce0dcb58c8497b5e731fde55858df441e4a039993 not found: ID does not exist" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.704580 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc 
kubenswrapper[4804]: I0217 13:50:16.710894 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7705a06d-bc27-4686-9ca4-4aae248ead07" (UID: "7705a06d-bc27-4686-9ca4-4aae248ead07"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.806131 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7705a06d-bc27-4686-9ca4-4aae248ead07-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.969441 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.979323 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.987958 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.988367 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988386 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: E0217 13:50:16.988403 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="setup-container" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988410 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="setup-container" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.988612 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" containerName="rabbitmq" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.989598 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.991915 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992091 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992250 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.992414 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995519 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cxlcf" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995519 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 13:50:16 crc kubenswrapper[4804]: I0217 13:50:16.995816 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.003534 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009427 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " 
pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009722 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009846 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.009964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010050 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010161 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 
13:50:17.010310 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010392 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010507 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010583 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.010663 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112640 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112702 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112747 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112780 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112803 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112843 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112914 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.112980 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113003 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113022 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113099 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 
13:50:17.113796 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.113899 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.116366 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.117460 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.118733 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119751 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b5f204e4-3b7a-4490-9c78-def5bf30f810-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119752 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5f204e4-3b7a-4490-9c78-def5bf30f810-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.119880 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5f204e4-3b7a-4490-9c78-def5bf30f810-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.121497 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.126396 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.131936 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lz2n\" (UniqueName: \"kubernetes.io/projected/b5f204e4-3b7a-4490-9c78-def5bf30f810-kube-api-access-2lz2n\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 
crc kubenswrapper[4804]: I0217 13:50:17.152142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b5f204e4-3b7a-4490-9c78-def5bf30f810\") " pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.323049 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.625760 4804 generic.go:334] "Generic (PLEG): container finished" podID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerID="e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" exitCode=0 Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.626007 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb"} Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.776706 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.856770 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962873 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962937 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.962974 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963001 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963027 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc 
kubenswrapper[4804]: I0217 13:50:17.963055 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963114 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963179 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963223 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.963263 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") 
pod \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\" (UID: \"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad\") " Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964347 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964373 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.964580 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.968546 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q" (OuterVolumeSpecName: "kube-api-access-cxh4q") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "kube-api-access-cxh4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.969761 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.970908 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.971174 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info" (OuterVolumeSpecName: "pod-info") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 13:50:17 crc kubenswrapper[4804]: I0217 13:50:17.973281 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.009542 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data" (OuterVolumeSpecName: "config-data") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.024973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf" (OuterVolumeSpecName: "server-conf") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064707 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064744 4804 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064753 4804 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064760 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 
crc kubenswrapper[4804]: I0217 13:50:18.064769 4804 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064777 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxh4q\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-kube-api-access-cxh4q\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064787 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064811 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064820 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.064830 4804 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.079994 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" (UID: "bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.083420 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.166970 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.167286 4804 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.586321 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7705a06d-bc27-4686-9ca4-4aae248ead07" path="/var/lib/kubelet/pods/7705a06d-bc27-4686-9ca4-4aae248ead07/volumes" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.638720 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"99bb36612b47257c670c0b1cca228d9c758565ee95e73e125105263015fbc589"} Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641209 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad","Type":"ContainerDied","Data":"3d33f0752018a1f8bfeaf3539a14d45be119615613d9a1b94e290b0a39b198ee"} Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641266 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.641288 4804 scope.go:117] "RemoveContainer" containerID="e223242f2f9a06365d51771062ed7df23bbf7ec9bda6057f41d25fb9aed813cb" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.673309 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.673974 4804 scope.go:117] "RemoveContainer" containerID="de02dbb74f45601647c918b390d5f93cfff604870702fca3316aca846c6db162" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.702395 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.713502 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: E0217 13:50:18.714018 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714032 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: E0217 13:50:18.714045 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="setup-container" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714052 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="setup-container" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.714282 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" containerName="rabbitmq" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.715293 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723389 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723434 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.723480 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724022 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m99n4" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724254 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724428 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.724669 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.728348 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879693 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879854 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.879925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880258 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880482 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880673 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880824 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.880980 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.881012 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.982932 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983027 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983055 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983097 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.983448 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984260 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984813 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.984915 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988410 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988507 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988629 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988712 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988736 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.988812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.989257 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.990183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c7ecd09-cd15-439d-9153-b55d9013bb83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.992290 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c7ecd09-cd15-439d-9153-b55d9013bb83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.992892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:18 crc kubenswrapper[4804]: I0217 13:50:18.998789 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.015178 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xc8\" (UniqueName: \"kubernetes.io/projected/4c7ecd09-cd15-439d-9153-b55d9013bb83-kube-api-access-65xc8\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.016626 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c7ecd09-cd15-439d-9153-b55d9013bb83-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.049772 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4c7ecd09-cd15-439d-9153-b55d9013bb83\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.347926 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.653283 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45"} Feb 17 13:50:19 crc kubenswrapper[4804]: I0217 13:50:19.876335 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 13:50:19 crc kubenswrapper[4804]: W0217 13:50:19.879034 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7ecd09_cd15_439d_9153_b55d9013bb83.slice/crio-43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7 WatchSource:0}: Error finding container 43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7: Status 404 returned error can't find the container with id 43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7 Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.586950 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad" path="/var/lib/kubelet/pods/bc485c5b-1bf7-473f-b5b0-a55d5dd0e2ad/volumes" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.670328 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"43fd8e308b41018c1084f87d8cf2d00eef3a0862921602fb8c2e4cd3b76b90d7"} Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.863035 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.865303 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.867572 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 17 13:50:20 crc kubenswrapper[4804]: I0217 13:50:20.903584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.027727 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.027977 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028061 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod 
\"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028126 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028163 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028226 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.028427 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130424 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130477 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130504 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130528 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130625 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.130700 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131402 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131412 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131606 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.131686 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") 
pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.132072 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.132529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.159262 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"dnsmasq-dns-d558885bc-h2xmr\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") " pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.198599 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:21 crc kubenswrapper[4804]: W0217 13:50:21.662734 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21997817_4e92_40db_a990_377f9cb88575.slice/crio-2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e WatchSource:0}: Error finding container 2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e: Status 404 returned error can't find the container with id 2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.675049 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"] Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.688722 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c"} Feb 17 13:50:21 crc kubenswrapper[4804]: I0217 13:50:21.692317 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerStarted","Data":"2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e"} Feb 17 13:50:22 crc kubenswrapper[4804]: I0217 13:50:22.703521 4804 generic.go:334] "Generic (PLEG): container finished" podID="21997817-4e92-40db-a990-377f9cb88575" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151" exitCode=0 Feb 17 13:50:22 crc kubenswrapper[4804]: I0217 13:50:22.703602 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"} Feb 17 13:50:23 crc 
kubenswrapper[4804]: I0217 13:50:23.716033 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerStarted","Data":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"} Feb 17 13:50:23 crc kubenswrapper[4804]: I0217 13:50:23.716354 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:23 crc kubenswrapper[4804]: I0217 13:50:23.750549 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" podStartSLOduration=3.750530833 podStartE2EDuration="3.750530833s" podCreationTimestamp="2026-02-17 13:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:23.734326693 +0000 UTC m=+1497.845746030" watchObservedRunningTime="2026-02-17 13:50:23.750530833 +0000 UTC m=+1497.861950170" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.200361 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.285507 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.286164 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns" containerID="cri-o://d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" gracePeriod=10 Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.465341 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.475610 4804 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.490660 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524723 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524899 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524923 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.524972 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525107 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525285 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.525424 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627340 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627424 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627488 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627506 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627547 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.627639 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.628831 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.629742 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.632240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.632792 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-config\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.633108 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.633363 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/69619ab8-5a40-43b9-8e9c-1a6e39893605-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.647557 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqgn\" (UniqueName: \"kubernetes.io/projected/69619ab8-5a40-43b9-8e9c-1a6e39893605-kube-api-access-lhqgn\") pod \"dnsmasq-dns-78c64bc9c5-2n5kn\" (UID: \"69619ab8-5a40-43b9-8e9c-1a6e39893605\") " pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.796016 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797779 4804 generic.go:334] "Generic (PLEG): container finished" podID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" exitCode=0 Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797830 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"} Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797896 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" event={"ID":"1fd51afd-ae34-4a67-bb79-a12d396968ef","Type":"ContainerDied","Data":"a2838e3552cf9ee264c86b1e5acbfc8482d43bbc95f8a3776ff5253f31fed64a"} Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.797925 4804 scope.go:117] "RemoveContainer" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.807181 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.829846 4804 scope.go:117] "RemoveContainer" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.897562 4804 scope.go:117] "RemoveContainer" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: E0217 13:50:31.898101 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": container with ID starting with d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b not found: ID does not exist" containerID="d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898131 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b"} err="failed to get container status \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": rpc error: code = NotFound desc = could not find container \"d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b\": container with ID starting with d4cba7c814c257ff7ad9adc2a7917f2afa8d6f42aecd2a0509014a2bc501130b not found: ID does not exist" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898157 4804 scope.go:117] "RemoveContainer" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: E0217 13:50:31.898542 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": container with ID starting with 
205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba not found: ID does not exist" containerID="205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.898579 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba"} err="failed to get container status \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": rpc error: code = NotFound desc = could not find container \"205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba\": container with ID starting with 205ba7bc73ab52ae1a2e677c3dd98e1d2935baea4c419316aadd89754cd8dfba not found: ID does not exist" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938354 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938718 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938751 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938795 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938846 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.938912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") pod \"1fd51afd-ae34-4a67-bb79-a12d396968ef\" (UID: \"1fd51afd-ae34-4a67-bb79-a12d396968ef\") " Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.945584 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz" (OuterVolumeSpecName: "kube-api-access-5jbtz") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "kube-api-access-5jbtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:50:31 crc kubenswrapper[4804]: I0217 13:50:31.999090 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.010583 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config" (OuterVolumeSpecName: "config") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.020049 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.028637 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.030519 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fd51afd-ae34-4a67-bb79-a12d396968ef" (UID: "1fd51afd-ae34-4a67-bb79-a12d396968ef"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041635 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041669 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-config\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041682 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041694 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041705 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fd51afd-ae34-4a67-bb79-a12d396968ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.041719 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jbtz\" (UniqueName: \"kubernetes.io/projected/1fd51afd-ae34-4a67-bb79-a12d396968ef-kube-api-access-5jbtz\") on node \"crc\" DevicePath \"\"" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.268258 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-2n5kn"] Feb 17 13:50:32 crc kubenswrapper[4804]: W0217 13:50:32.271599 4804 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69619ab8_5a40_43b9_8e9c_1a6e39893605.slice/crio-a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8 WatchSource:0}: Error finding container a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8: Status 404 returned error can't find the container with id a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8 Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.810617 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6rhsw" Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812370 4804 generic.go:334] "Generic (PLEG): container finished" podID="69619ab8-5a40-43b9-8e9c-1a6e39893605" containerID="b62be41e8ccaaa638798d42205e9b4a49fe20a084508b5c1cf72e1a16901037d" exitCode=0 Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812466 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerDied","Data":"b62be41e8ccaaa638798d42205e9b4a49fe20a084508b5c1cf72e1a16901037d"} Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.812530 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerStarted","Data":"a2833db21abb080065d466cefa071c427cf380547cde57194f66ad2a0a2a0bb8"} Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.865846 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:32 crc kubenswrapper[4804]: I0217 13:50:32.871689 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6rhsw"] Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.822527 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" 
event={"ID":"69619ab8-5a40-43b9-8e9c-1a6e39893605","Type":"ContainerStarted","Data":"7588b9a633ef9fa67028868a7659fef6354d900704f38d8b0fff91f9262c248f"}
Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.823082 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn"
Feb 17 13:50:33 crc kubenswrapper[4804]: I0217 13:50:33.849304 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn" podStartSLOduration=2.849283142 podStartE2EDuration="2.849283142s" podCreationTimestamp="2026-02-17 13:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:33.843801689 +0000 UTC m=+1507.955221026" watchObservedRunningTime="2026-02-17 13:50:33.849283142 +0000 UTC m=+1507.960702479"
Feb 17 13:50:34 crc kubenswrapper[4804]: I0217 13:50:34.584617 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" path="/var/lib/kubelet/pods/1fd51afd-ae34-4a67-bb79-a12d396968ef/volumes"
Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.809261 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-2n5kn"
Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.879775 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"]
Feb 17 13:50:41 crc kubenswrapper[4804]: I0217 13:50:41.880048 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns" containerID="cri-o://1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" gracePeriod=10
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.410258 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533816 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533884 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.533946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534016 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534078 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534093 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.534129 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") pod \"21997817-4e92-40db-a990-377f9cb88575\" (UID: \"21997817-4e92-40db-a990-377f9cb88575\") "
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.542999 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb" (OuterVolumeSpecName: "kube-api-access-9gwlb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "kube-api-access-9gwlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.587349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.594904 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.599490 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config" (OuterVolumeSpecName: "config") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.624639 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636490 4804 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636521 4804 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636531 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gwlb\" (UniqueName: \"kubernetes.io/projected/21997817-4e92-40db-a990-377f9cb88575-kube-api-access-9gwlb\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636541 4804 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-config\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.636549 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.637356 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.637598 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21997817-4e92-40db-a990-377f9cb88575" (UID: "21997817-4e92-40db-a990-377f9cb88575"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.738214 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.738440 4804 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21997817-4e92-40db-a990-377f9cb88575-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901164 4804 generic.go:334] "Generic (PLEG): container finished" podID="21997817-4e92-40db-a990-377f9cb88575" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416" exitCode=0
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901277 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-h2xmr"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"}
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-h2xmr" event={"ID":"21997817-4e92-40db-a990-377f9cb88575","Type":"ContainerDied","Data":"2b43909fd91c7907b0c4c4dede50166418cb4c4d2762c97bd67c0204aea48a8e"}
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.901439 4804 scope.go:117] "RemoveContainer" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.928170 4804 scope.go:117] "RemoveContainer" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.944927 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"]
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.952450 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-h2xmr"]
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.972714 4804 scope.go:117] "RemoveContainer" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"
Feb 17 13:50:42 crc kubenswrapper[4804]: E0217 13:50:42.973273 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": container with ID starting with 1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416 not found: ID does not exist" containerID="1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.973323 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416"} err="failed to get container status \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": rpc error: code = NotFound desc = could not find container \"1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416\": container with ID starting with 1853184b1187f496d8adb4b288cb7e48e6f58ce71697566adfd0006d97f07416 not found: ID does not exist"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.973379 4804 scope.go:117] "RemoveContainer" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"
Feb 17 13:50:42 crc kubenswrapper[4804]: E0217 13:50:42.974934 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": container with ID starting with f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151 not found: ID does not exist" containerID="f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"
Feb 17 13:50:42 crc kubenswrapper[4804]: I0217 13:50:42.974968 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151"} err="failed to get container status \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": rpc error: code = NotFound desc = could not find container \"f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151\": container with ID starting with f545c90aa94e26fbc7cec07ad68557bb9c70b4171384426ba058a0b4bbc12151 not found: ID does not exist"
Feb 17 13:50:44 crc kubenswrapper[4804]: I0217 13:50:44.590251 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21997817-4e92-40db-a990-377f9cb88575" path="/var/lib/kubelet/pods/21997817-4e92-40db-a990-377f9cb88575/volumes"
Feb 17 13:50:51 crc kubenswrapper[4804]: I0217 13:50:51.991359 4804 generic.go:334] "Generic (PLEG): container finished" podID="b5f204e4-3b7a-4490-9c78-def5bf30f810" containerID="bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45" exitCode=0
Feb 17 13:50:51 crc kubenswrapper[4804]: I0217 13:50:51.991467 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerDied","Data":"bbcf95a398f7b865c615ecb45334f7dffb36eeee0c13f1cb51c751c688537e45"}
Feb 17 13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.003356 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5f204e4-3b7a-4490-9c78-def5bf30f810","Type":"ContainerStarted","Data":"f37813e04ded1da97dc64ecb8e05603ecfccc96d0e8da449644b588c033afc96"}
Feb 17 13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.004059 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 17 13:50:53 crc kubenswrapper[4804]: I0217 13:50:53.041810 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.041789558 podStartE2EDuration="37.041789558s" podCreationTimestamp="2026-02-17 13:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:53.027286022 +0000 UTC m=+1527.138705439" watchObservedRunningTime="2026-02-17 13:50:53.041789558 +0000 UTC m=+1527.153208905"
Feb 17 13:50:54 crc kubenswrapper[4804]: I0217 13:50:54.013526 4804 generic.go:334] "Generic (PLEG): container finished" podID="4c7ecd09-cd15-439d-9153-b55d9013bb83" containerID="65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c" exitCode=0
Feb 17 13:50:54 crc kubenswrapper[4804]: I0217 13:50:54.014520 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerDied","Data":"65ada82f256453be4311fa6e9f31586da6868d81949300ee7ece41d6b113174c"}
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.025414 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4c7ecd09-cd15-439d-9153-b55d9013bb83","Type":"ContainerStarted","Data":"08e4011d2774f96d2bbb1f16dca387dc71133a1e4c6dfe8de710cb158c3a61a1"}
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.027114 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.056347 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.056327438 podStartE2EDuration="37.056327438s" podCreationTimestamp="2026-02-17 13:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 13:50:55.052158667 +0000 UTC m=+1529.163578024" watchObservedRunningTime="2026-02-17 13:50:55.056327438 +0000 UTC m=+1529.167746775"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.079766 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"]
Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080815 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080843 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080892 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="init"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080903 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="init"
Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080934 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="init"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080946 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="init"
Feb 17 13:50:55 crc kubenswrapper[4804]: E0217 13:50:55.080964 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.080971 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.081498 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="21997817-4e92-40db-a990-377f9cb88575" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.081544 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd51afd-ae34-4a67-bb79-a12d396968ef" containerName="dnsmasq-dns"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.082542 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.088543 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.088977 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.089284 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.101732 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.137032 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"]
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188643 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188766 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188788 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.188806 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.290923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291069 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291096 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.291123 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.296073 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.296358 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.297875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.320882 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zctst\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.407560 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"
Feb 17 13:50:55 crc kubenswrapper[4804]: I0217 13:50:55.990624 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst"]
Feb 17 13:50:56 crc kubenswrapper[4804]: I0217 13:50:56.000968 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 13:50:56 crc kubenswrapper[4804]: I0217 13:50:56.052937 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerStarted","Data":"05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd"}
Feb 17 13:51:07 crc kubenswrapper[4804]: I0217 13:51:07.326602 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.351393 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.821156 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"]
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.823637 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.831873 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"]
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897661 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897792 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:09 crc kubenswrapper[4804]: I0217 13:51:09.897885 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.000990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.001174 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.001277 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.002067 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.002132 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.022909 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"certified-operators-gp6sw\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") " pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:10 crc kubenswrapper[4804]: I0217 13:51:10.146652 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.230137 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerStarted","Data":"0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24"}
Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.255961 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" podStartSLOduration=1.299775653 podStartE2EDuration="16.255942397s" podCreationTimestamp="2026-02-17 13:50:55 +0000 UTC" firstStartedPulling="2026-02-17 13:50:56.000559743 +0000 UTC m=+1530.111979120" lastFinishedPulling="2026-02-17 13:51:10.956726527 +0000 UTC m=+1545.068145864" observedRunningTime="2026-02-17 13:51:11.247388089 +0000 UTC m=+1545.358807426" watchObservedRunningTime="2026-02-17 13:51:11.255942397 +0000 UTC m=+1545.367361734"
Feb 17 13:51:11 crc kubenswrapper[4804]: I0217 13:51:11.421958 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"]
Feb 17 13:51:11 crc kubenswrapper[4804]: W0217 13:51:11.430441 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceeb88bc_5350_49f1_9a6b_0ecb88f986e4.slice/crio-a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b WatchSource:0}: Error finding container a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b: Status 404 returned error can't find the container with id a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b
Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.246289 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" exitCode=0
Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.246383 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519"}
Feb 17 13:51:12 crc kubenswrapper[4804]: I0217 13:51:12.247442 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b"}
Feb 17 13:51:13 crc kubenswrapper[4804]: I0217 13:51:13.264459 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"}
Feb 17 13:51:14 crc kubenswrapper[4804]: I0217 13:51:14.276630 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" exitCode=0
Feb 17 13:51:14 crc kubenswrapper[4804]: I0217 13:51:14.276753 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"}
Feb 17 13:51:15 crc kubenswrapper[4804]: I0217 13:51:15.287579 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerStarted","Data":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"}
Feb 17 13:51:15 crc kubenswrapper[4804]: I0217 13:51:15.313364 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gp6sw" podStartSLOduration=3.9140360100000002 podStartE2EDuration="6.313343978s" podCreationTimestamp="2026-02-17 13:51:09 +0000 UTC" firstStartedPulling="2026-02-17 13:51:12.249813062 +0000 UTC m=+1546.361232429" lastFinishedPulling="2026-02-17 13:51:14.64912102 +0000 UTC m=+1548.760540397" observedRunningTime="2026-02-17 13:51:15.305481 +0000 UTC m=+1549.416900337" watchObservedRunningTime="2026-02-17 13:51:15.313343978 +0000 UTC m=+1549.424763315"
Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.147751 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.148420 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.201168 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.379540 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:20 crc kubenswrapper[4804]: I0217 13:51:20.441550 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"]
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354286 4804 generic.go:334] "Generic (PLEG): container finished" podID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerID="0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24" exitCode=0
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerDied","Data":"0232936103e32990e0e1d2addaaa872375bd84efce8c9daca39d54b7e7e36e24"}
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.354704 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gp6sw" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" containerID="cri-o://1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" gracePeriod=2
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.848621 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw"
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985443 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") "
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") "
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.985742 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") pod \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\" (UID: \"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4\") "
Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.986946 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities"
(OuterVolumeSpecName: "utilities") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:22 crc kubenswrapper[4804]: I0217 13:51:22.993098 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f" (OuterVolumeSpecName: "kube-api-access-l9m5f") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "kube-api-access-l9m5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.052066 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" (UID: "ceeb88bc-5350-49f1-9a6b-0ecb88f986e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088007 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088052 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.088067 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9m5f\" (UniqueName: \"kubernetes.io/projected/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4-kube-api-access-l9m5f\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365388 4804 generic.go:334] "Generic (PLEG): container finished" podID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" exitCode=0 Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365503 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"} Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365752 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp6sw" event={"ID":"ceeb88bc-5350-49f1-9a6b-0ecb88f986e4","Type":"ContainerDied","Data":"a85d096fbf07b9a8e92b330284d80c0c7cd763b8369a93e042fb58cb78dcbd2b"} Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.365779 4804 scope.go:117] "RemoveContainer" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 
13:51:23.365553 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp6sw" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.413451 4804 scope.go:117] "RemoveContainer" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.424116 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.432340 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gp6sw"] Feb 17 13:51:23 crc kubenswrapper[4804]: I0217 13:51:23.503279 4804 scope.go:117] "RemoveContainer" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571053 4804 scope.go:117] "RemoveContainer" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.571465 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": container with ID starting with 1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb not found: ID does not exist" containerID="1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571502 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb"} err="failed to get container status \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": rpc error: code = NotFound desc = could not find container \"1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb\": container with ID starting with 
1cb515e4542d2839ebe4a87276ecda3fb68d6f641ed04abf474167ae98e16bbb not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571530 4804 scope.go:117] "RemoveContainer" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.571822 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": container with ID starting with 6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2 not found: ID does not exist" containerID="6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571851 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2"} err="failed to get container status \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": rpc error: code = NotFound desc = could not find container \"6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2\": container with ID starting with 6e551329a126bdc60b3a5a9061abc08b824a07f0c524616126c7cb74bd4b75a2 not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:23.571863 4804 scope.go:117] "RemoveContainer" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc kubenswrapper[4804]: E0217 13:51:23.572733 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": container with ID starting with ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519 not found: ID does not exist" containerID="ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519" Feb 17 13:51:24 crc 
kubenswrapper[4804]: I0217 13:51:23.572751 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519"} err="failed to get container status \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": rpc error: code = NotFound desc = could not find container \"ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519\": container with ID starting with ea493416ba3c0cbb2f9663a54ae4bfe816e4d282842e5140c568a1be6cad8519 not found: ID does not exist" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.377885 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" event={"ID":"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd","Type":"ContainerDied","Data":"05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd"} Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.378252 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e7fc719fab70095a5b91a4cef4c9ad73a02dbe075b07303a0bfd47bc2532bd" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.454038 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.585715 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" path="/var/lib/kubelet/pods/ceeb88bc-5350-49f1-9a6b-0ecb88f986e4/volumes" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.620770 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.621045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.621921 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.622045 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") pod \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\" (UID: \"ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd\") " Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.627877 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m" (OuterVolumeSpecName: "kube-api-access-9s77m") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "kube-api-access-9s77m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.628002 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.651379 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory" (OuterVolumeSpecName: "inventory") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.651388 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" (UID: "ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725262 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s77m\" (UniqueName: \"kubernetes.io/projected/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-kube-api-access-9s77m\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725301 4804 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725312 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:24 crc kubenswrapper[4804]: I0217 13:51:24.725324 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.390807 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zctst" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561217 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561850 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561886 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561917 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-utilities" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561930 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-utilities" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.561973 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-content" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.561985 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="extract-content" Feb 17 13:51:25 crc kubenswrapper[4804]: E0217 13:51:25.562001 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562012 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562339 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ceeb88bc-5350-49f1-9a6b-0ecb88f986e4" containerName="registry-server" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.562390 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.563472 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.565613 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.566007 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.566233 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.568411 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.569000 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.642109 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 
13:51:25.642699 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.642843 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744738 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744836 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.744866 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") 
pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.750960 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.751713 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.761271 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z6s9f\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:25 crc kubenswrapper[4804]: I0217 13:51:25.880479 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:26 crc kubenswrapper[4804]: I0217 13:51:26.415688 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f"] Feb 17 13:51:26 crc kubenswrapper[4804]: I0217 13:51:26.631805 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.178565 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.182329 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.195949 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371500 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371546 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.371693 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.409109 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerStarted","Data":"4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f"} Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.409159 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerStarted","Data":"0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc"} Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.427136 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" podStartSLOduration=2.231576782 podStartE2EDuration="2.427114015s" podCreationTimestamp="2026-02-17 13:51:25 +0000 UTC" firstStartedPulling="2026-02-17 13:51:26.431580078 +0000 UTC m=+1560.542999415" lastFinishedPulling="2026-02-17 13:51:26.627117311 +0000 UTC m=+1560.738536648" observedRunningTime="2026-02-17 13:51:27.422992645 +0000 UTC m=+1561.534411982" watchObservedRunningTime="2026-02-17 13:51:27.427114015 +0000 UTC m=+1561.538533352" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.473456 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: 
I0217 13:51:27.473519 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.473776 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.474023 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.474399 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.498442 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"redhat-marketplace-nb9wd\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:27 crc kubenswrapper[4804]: I0217 13:51:27.501019 4804 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.008001 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.419432 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0" exitCode=0 Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.419532 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"} Feb 17 13:51:28 crc kubenswrapper[4804]: I0217 13:51:28.420423 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"5096b0ade58765cbb70c123fde8ddf796f5301f72982d1f2729abe092a910d91"} Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.430505 4804 generic.go:334] "Generic (PLEG): container finished" podID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerID="4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f" exitCode=0 Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.430610 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerDied","Data":"4a217d28653fcb3108dc054ac2dd9db14b19f53aeacc55277c807dba99e6cd5f"} Feb 17 13:51:29 crc kubenswrapper[4804]: I0217 13:51:29.436105 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" 
event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} Feb 17 13:51:30 crc kubenswrapper[4804]: I0217 13:51:30.445547 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4" exitCode=0 Feb 17 13:51:30 crc kubenswrapper[4804]: I0217 13:51:30.445620 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.008671 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170395 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170467 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.170634 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") pod \"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\" (UID: 
\"c87b0376-c505-452b-90ed-0e6bb7e6e8e0\") " Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.176810 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz" (OuterVolumeSpecName: "kube-api-access-z9rtz") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "kube-api-access-z9rtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.201456 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory" (OuterVolumeSpecName: "inventory") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.201485 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c87b0376-c505-452b-90ed-0e6bb7e6e8e0" (UID: "c87b0376-c505-452b-90ed-0e6bb7e6e8e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273237 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273272 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.273286 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rtz\" (UniqueName: \"kubernetes.io/projected/c87b0376-c505-452b-90ed-0e6bb7e6e8e0-kube-api-access-z9rtz\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455766 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" event={"ID":"c87b0376-c505-452b-90ed-0e6bb7e6e8e0","Type":"ContainerDied","Data":"0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc"} Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455825 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db74678f890e06c2b9958a4e27efc6ebec25a1ff0a24b96d6b328c59548fcfc" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.455821 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z6s9f" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.530749 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:31 crc kubenswrapper[4804]: E0217 13:51:31.531274 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.531299 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.531548 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87b0376-c505-452b-90ed-0e6bb7e6e8e0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.532334 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.535448 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.535480 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.536296 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.537053 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.553603 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.679836 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.679937 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 
13:51:31.680149 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.680370 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782156 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782297 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782344 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: 
\"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.782444 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.786702 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.786722 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.790720 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.800932 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:31 crc kubenswrapper[4804]: I0217 13:51:31.853026 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.408138 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p"] Feb 17 13:51:32 crc kubenswrapper[4804]: W0217 13:51:32.419166 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ee075c2_2363_4446_8545_dfdece6ca4da.slice/crio-18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e WatchSource:0}: Error finding container 18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e: Status 404 returned error can't find the container with id 18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.471406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerStarted","Data":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"} Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.475253 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" 
event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerStarted","Data":"18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e"} Feb 17 13:51:32 crc kubenswrapper[4804]: I0217 13:51:32.493976 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nb9wd" podStartSLOduration=1.747494457 podStartE2EDuration="5.493955989s" podCreationTimestamp="2026-02-17 13:51:27 +0000 UTC" firstStartedPulling="2026-02-17 13:51:28.421189835 +0000 UTC m=+1562.532609172" lastFinishedPulling="2026-02-17 13:51:32.167651367 +0000 UTC m=+1566.279070704" observedRunningTime="2026-02-17 13:51:32.489971363 +0000 UTC m=+1566.601390700" watchObservedRunningTime="2026-02-17 13:51:32.493955989 +0000 UTC m=+1566.605375336" Feb 17 13:51:33 crc kubenswrapper[4804]: I0217 13:51:33.486962 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerStarted","Data":"c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f"} Feb 17 13:51:33 crc kubenswrapper[4804]: I0217 13:51:33.506451 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" podStartSLOduration=2.328123494 podStartE2EDuration="2.506430447s" podCreationTimestamp="2026-02-17 13:51:31 +0000 UTC" firstStartedPulling="2026-02-17 13:51:32.422122911 +0000 UTC m=+1566.533542258" lastFinishedPulling="2026-02-17 13:51:32.600429874 +0000 UTC m=+1566.711849211" observedRunningTime="2026-02-17 13:51:33.50334423 +0000 UTC m=+1567.614763577" watchObservedRunningTime="2026-02-17 13:51:33.506430447 +0000 UTC m=+1567.617849784" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.501664 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: 
I0217 13:51:37.502874 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.552941 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.607252 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:37 crc kubenswrapper[4804]: I0217 13:51:37.796222 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:39 crc kubenswrapper[4804]: I0217 13:51:39.541240 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nb9wd" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server" containerID="cri-o://905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" gracePeriod=2 Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.090721 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247327 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247683 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.247942 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") pod \"4b5520af-e860-4937-af9c-049b304c0cf9\" (UID: \"4b5520af-e860-4937-af9c-049b304c0cf9\") " Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.249496 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities" (OuterVolumeSpecName: "utilities") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.253486 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5" (OuterVolumeSpecName: "kube-api-access-bhqb5") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "kube-api-access-bhqb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.271355 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b5520af-e860-4937-af9c-049b304c0cf9" (UID: "4b5520af-e860-4937-af9c-049b304c0cf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350544 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350760 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqb5\" (UniqueName: \"kubernetes.io/projected/4b5520af-e860-4937-af9c-049b304c0cf9-kube-api-access-bhqb5\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.350873 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5520af-e860-4937-af9c-049b304c0cf9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551508 4804 generic.go:334] "Generic (PLEG): container finished" podID="4b5520af-e860-4937-af9c-049b304c0cf9" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" exitCode=0 Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551552 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"} Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551576 4804 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nb9wd" event={"ID":"4b5520af-e860-4937-af9c-049b304c0cf9","Type":"ContainerDied","Data":"5096b0ade58765cbb70c123fde8ddf796f5301f72982d1f2729abe092a910d91"} Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551590 4804 scope.go:117] "RemoveContainer" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.551709 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nb9wd" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.588932 4804 scope.go:117] "RemoveContainer" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.590036 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.623429 4804 scope.go:117] "RemoveContainer" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.639676 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nb9wd"] Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.686360 4804 scope.go:117] "RemoveContainer" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 13:51:40.695380 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": container with ID starting with 905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9 not found: ID does not exist" containerID="905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.695432 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9"} err="failed to get container status \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": rpc error: code = NotFound desc = could not find container \"905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9\": container with ID starting with 905e5dc8ee2e00573877efc055742673aa8e1bf7d562cb7634ae4819ea807bf9 not found: ID does not exist" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.695465 4804 scope.go:117] "RemoveContainer" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4" Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 13:51:40.699348 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": container with ID starting with a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4 not found: ID does not exist" containerID="a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.699393 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4"} err="failed to get container status \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": rpc error: code = NotFound desc = could not find container \"a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4\": container with ID starting with a67e0fbe664a402c8f0dfc28851d9a83b3e1980152c4e8c2c7183643829b58b4 not found: ID does not exist" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.699418 4804 scope.go:117] "RemoveContainer" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0" Feb 17 13:51:40 crc kubenswrapper[4804]: E0217 
13:51:40.710405 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": container with ID starting with 1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0 not found: ID does not exist" containerID="1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0" Feb 17 13:51:40 crc kubenswrapper[4804]: I0217 13:51:40.710458 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0"} err="failed to get container status \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": rpc error: code = NotFound desc = could not find container \"1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0\": container with ID starting with 1b7acd23a0c9385c3b2b090c72ed4c69fd030cf041066eb9a7870d78777b66c0 not found: ID does not exist" Feb 17 13:51:42 crc kubenswrapper[4804]: I0217 13:51:42.584421 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" path="/var/lib/kubelet/pods/4b5520af-e860-4937-af9c-049b304c0cf9/volumes" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.249163 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251477 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251584 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server" Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251671 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-content" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251742 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-content" Feb 17 13:51:59 crc kubenswrapper[4804]: E0217 13:51:59.251838 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-utilities" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.251907 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="extract-utilities" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.252188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5520af-e860-4937-af9c-049b304c0cf9" containerName="registry-server" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.254271 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.281444 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339540 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: 
\"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.339687 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442241 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442398 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442492 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.442996 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"community-operators-d49qr\" (UID: 
\"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.443410 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.467436 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"community-operators-d49qr\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:51:59 crc kubenswrapper[4804]: I0217 13:51:59.581899 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.297642 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755178 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420" exitCode=0 Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755254 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"} Feb 17 13:52:00 crc kubenswrapper[4804]: I0217 13:52:00.755290 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerStarted","Data":"ad252798f89f9f5003c362f37cb3de655136fdb2a16e7eaa3681ff60f9f272d2"} Feb 17 13:52:02 crc kubenswrapper[4804]: I0217 13:52:02.784987 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31" exitCode=0 Feb 17 13:52:02 crc kubenswrapper[4804]: I0217 13:52:02.785186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"} Feb 17 13:52:03 crc kubenswrapper[4804]: I0217 13:52:03.797092 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" 
event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerStarted","Data":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"} Feb 17 13:52:03 crc kubenswrapper[4804]: I0217 13:52:03.816427 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d49qr" podStartSLOduration=2.351987312 podStartE2EDuration="4.816402136s" podCreationTimestamp="2026-02-17 13:51:59 +0000 UTC" firstStartedPulling="2026-02-17 13:52:00.757067072 +0000 UTC m=+1594.868486409" lastFinishedPulling="2026-02-17 13:52:03.221481906 +0000 UTC m=+1597.332901233" observedRunningTime="2026-02-17 13:52:03.812276947 +0000 UTC m=+1597.923696274" watchObservedRunningTime="2026-02-17 13:52:03.816402136 +0000 UTC m=+1597.927821473" Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.584506 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.585025 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.632491 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:09 crc kubenswrapper[4804]: I0217 13:52:09.897185 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:10 crc kubenswrapper[4804]: I0217 13:52:10.003922 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:52:11 crc kubenswrapper[4804]: I0217 13:52:11.868310 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d49qr" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" 
containerID="cri-o://b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" gracePeriod=2 Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.131490 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b31d2b3_4599_40a8_b1c0_3f0f795cd13b.slice/crio-b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05.scope\": RecentStats: unable to find data in memory cache]" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.314938 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406700 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406915 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.406998 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") pod \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\" (UID: \"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b\") " Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.407930 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities" 
(OuterVolumeSpecName: "utilities") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.417161 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm" (OuterVolumeSpecName: "kube-api-access-wxrbm") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "kube-api-access-wxrbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.454456 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" (UID: "8b31d2b3-4599-40a8-b1c0-3f0f795cd13b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509246 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-kube-api-access-wxrbm\") on node \"crc\" DevicePath \"\"" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509285 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.509295 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878831 4804 generic.go:334] "Generic (PLEG): container finished" podID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" exitCode=0 Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"} Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878899 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d49qr" event={"ID":"8b31d2b3-4599-40a8-b1c0-3f0f795cd13b","Type":"ContainerDied","Data":"ad252798f89f9f5003c362f37cb3de655136fdb2a16e7eaa3681ff60f9f272d2"} Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.878915 4804 scope.go:117] "RemoveContainer" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 
13:52:12.878931 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d49qr" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.903792 4804 scope.go:117] "RemoveContainer" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.912244 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.931155 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d49qr"] Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.931214 4804 scope.go:117] "RemoveContainer" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.968476 4804 scope.go:117] "RemoveContainer" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.969811 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": container with ID starting with b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05 not found: ID does not exist" containerID="b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.969850 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05"} err="failed to get container status \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": rpc error: code = NotFound desc = could not find container \"b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05\": container with ID starting with 
b1c95098baf87d60dddfa398d407b560dc9dfc88f9bc0afa2a6424daa3df5b05 not found: ID does not exist" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.969876 4804 scope.go:117] "RemoveContainer" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31" Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.970313 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": container with ID starting with c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31 not found: ID does not exist" containerID="c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.970354 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31"} err="failed to get container status \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": rpc error: code = NotFound desc = could not find container \"c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31\": container with ID starting with c52aef2a5b0f26c58811f5709c4e506ccb93b90fb40970df9641fe23a08e0c31 not found: ID does not exist" Feb 17 13:52:12 crc kubenswrapper[4804]: I0217 13:52:12.970380 4804 scope.go:117] "RemoveContainer" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420" Feb 17 13:52:12 crc kubenswrapper[4804]: E0217 13:52:12.970736 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": container with ID starting with 80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420 not found: ID does not exist" containerID="80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420" Feb 17 13:52:12 crc 
kubenswrapper[4804]: I0217 13:52:12.970763 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420"} err="failed to get container status \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": rpc error: code = NotFound desc = could not find container \"80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420\": container with ID starting with 80882e9d11ff70b7f4229ea7891e6b71a03c35cd89308d5ce5725493e12e7420 not found: ID does not exist" Feb 17 13:52:14 crc kubenswrapper[4804]: I0217 13:52:14.585961 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" path="/var/lib/kubelet/pods/8b31d2b3-4599-40a8-b1c0-3f0f795cd13b/volumes" Feb 17 13:52:25 crc kubenswrapper[4804]: I0217 13:52:25.835453 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:52:25 crc kubenswrapper[4804]: I0217 13:52:25.835864 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:52:28 crc kubenswrapper[4804]: I0217 13:52:28.980663 4804 scope.go:117] "RemoveContainer" containerID="fe58294f85ff06a0d32971760c88b3a7d0ebe711d822c93e180307f22e74f6a0" Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.014601 4804 scope.go:117] "RemoveContainer" containerID="dca62d14dda6868e926b57148e8cd74b64e632384abd99e1788d3d27c22c4765" Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.056668 4804 
scope.go:117] "RemoveContainer" containerID="c204297abebd9a53145ab03c24cc8848ddb7478ea7164daa834f5efc7f82083d" Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.082931 4804 scope.go:117] "RemoveContainer" containerID="525c6762f9ba8180d2f6b437538441d8677513d8b708766e65a25901daeb816c" Feb 17 13:52:29 crc kubenswrapper[4804]: I0217 13:52:29.139727 4804 scope.go:117] "RemoveContainer" containerID="b7047f4fb5cc51bf92eedf0304d4d9a035247692d44844e1d6c89de23d58aef4" Feb 17 13:52:55 crc kubenswrapper[4804]: I0217 13:52:55.835804 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:52:55 crc kubenswrapper[4804]: I0217 13:52:55.836445 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.835756 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.837859 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 13:53:25 crc 
kubenswrapper[4804]: I0217 13:53:25.838428 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.839682 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 13:53:25 crc kubenswrapper[4804]: I0217 13:53:25.839997 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" gracePeriod=600 Feb 17 13:53:25 crc kubenswrapper[4804]: E0217 13:53:25.967517 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.618935 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" exitCode=0 Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.619024 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"} Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.619337 4804 scope.go:117] "RemoveContainer" containerID="845afb2d1d32c8f1c4420bf9c6d30ae92d7fd53810dea6b094c1e266f88044e6" Feb 17 13:53:26 crc kubenswrapper[4804]: I0217 13:53:26.620027 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:53:26 crc kubenswrapper[4804]: E0217 13:53:26.620356 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.301535 4804 scope.go:117] "RemoveContainer" containerID="723476fd1d8f467255808440fe7e8799143ee2007a7f138345fcc04e2663bf99" Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.331751 4804 scope.go:117] "RemoveContainer" containerID="2b63a870c70f085dee0bf900b7beba65015e3aff6e6541b29544712e34dd77a9" Feb 17 13:53:29 crc kubenswrapper[4804]: I0217 13:53:29.360924 4804 scope.go:117] "RemoveContainer" containerID="8d6d4b8225dc05b2f8ac6fe66b04d57f0e324f2f754fb6ddc82de82d73688709" Feb 17 13:53:41 crc kubenswrapper[4804]: I0217 13:53:41.575503 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:53:41 crc kubenswrapper[4804]: E0217 13:53:41.576416 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:53:55 crc kubenswrapper[4804]: I0217 13:53:55.574819 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:53:55 crc kubenswrapper[4804]: E0217 13:53:55.575609 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:54:09 crc kubenswrapper[4804]: I0217 13:54:09.574184 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:09 crc kubenswrapper[4804]: E0217 13:54:09.574828 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:54:22 crc kubenswrapper[4804]: I0217 13:54:22.574876 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:22 crc kubenswrapper[4804]: E0217 13:54:22.576318 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:54:30 crc kubenswrapper[4804]: I0217 13:54:30.256655 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerID="c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f" exitCode=0 Feb 17 13:54:30 crc kubenswrapper[4804]: I0217 13:54:30.256729 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerDied","Data":"c30c97b714db6eaaea3d99e426020e3d5b0cd168a7762b36fc6e65e7574bc11f"} Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.773846 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.937946 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938182 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938417 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.938479 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") pod \"9ee075c2-2363-4446-8545-dfdece6ca4da\" (UID: \"9ee075c2-2363-4446-8545-dfdece6ca4da\") " Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.944865 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6" (OuterVolumeSpecName: "kube-api-access-bpnt6") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "kube-api-access-bpnt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.947503 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.966448 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory" (OuterVolumeSpecName: "inventory") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:54:31 crc kubenswrapper[4804]: I0217 13:54:31.971067 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ee075c2-2363-4446-8545-dfdece6ca4da" (UID: "9ee075c2-2363-4446-8545-dfdece6ca4da"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.040848 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpnt6\" (UniqueName: \"kubernetes.io/projected/9ee075c2-2363-4446-8545-dfdece6ca4da-kube-api-access-bpnt6\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041505 4804 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041526 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.041536 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ee075c2-2363-4446-8545-dfdece6ca4da-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.302029 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" event={"ID":"9ee075c2-2363-4446-8545-dfdece6ca4da","Type":"ContainerDied","Data":"18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e"} Feb 17 13:54:32 
crc kubenswrapper[4804]: I0217 13:54:32.302069 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b9b9e278e8c83dcf92ce23e64cdae2695221512b4169c9866fb3e6753d918e" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.302113 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.463816 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464227 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-utilities" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464242 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-utilities" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464257 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464264 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464278 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464285 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: E0217 13:54:32.464301 4804 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-content" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464306 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="extract-content" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464472 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee075c2-2363-4446-8545-dfdece6ca4da" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.464485 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b31d2b3-4599-40a8-b1c0-3f0f795cd13b" containerName="registry-server" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.465035 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.467491 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468162 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468365 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.468520 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.480519 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.651848 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.651897 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.652678 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756012 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756062 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.756112 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.760599 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.760625 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.779950 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 
13:54:32 crc kubenswrapper[4804]: I0217 13:54:32.783925 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:54:33 crc kubenswrapper[4804]: I0217 13:54:33.282333 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc"] Feb 17 13:54:33 crc kubenswrapper[4804]: I0217 13:54:33.312971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerStarted","Data":"8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b"} Feb 17 13:54:34 crc kubenswrapper[4804]: I0217 13:54:34.332281 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerStarted","Data":"c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4"} Feb 17 13:54:34 crc kubenswrapper[4804]: I0217 13:54:34.362894 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" podStartSLOduration=2.177384309 podStartE2EDuration="2.362865706s" podCreationTimestamp="2026-02-17 13:54:32 +0000 UTC" firstStartedPulling="2026-02-17 13:54:33.288236524 +0000 UTC m=+1747.399655861" lastFinishedPulling="2026-02-17 13:54:33.473717901 +0000 UTC m=+1747.585137258" observedRunningTime="2026-02-17 13:54:34.355434482 +0000 UTC m=+1748.466853849" watchObservedRunningTime="2026-02-17 13:54:34.362865706 +0000 UTC m=+1748.474285073" Feb 17 13:54:35 crc kubenswrapper[4804]: I0217 13:54:35.573628 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:35 crc kubenswrapper[4804]: E0217 13:54:35.574159 4804 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:54:49 crc kubenswrapper[4804]: I0217 13:54:49.574246 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:54:49 crc kubenswrapper[4804]: E0217 13:54:49.574962 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:01 crc kubenswrapper[4804]: I0217 13:55:01.574337 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:01 crc kubenswrapper[4804]: E0217 13:55:01.575110 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:14 crc kubenswrapper[4804]: I0217 13:55:14.574440 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:14 crc kubenswrapper[4804]: E0217 
13:55:14.575639 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:26 crc kubenswrapper[4804]: I0217 13:55:26.580902 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:26 crc kubenswrapper[4804]: E0217 13:55:26.581623 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:37 crc kubenswrapper[4804]: I0217 13:55:37.574727 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:37 crc kubenswrapper[4804]: E0217 13:55:37.575579 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.052107 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 
13:55:39.063619 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.071895 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.079806 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-886b-account-create-update-h84mx"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.088035 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0898-account-create-update-6vpd7"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.096346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.105032 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dl5b9"] Feb 17 13:55:39 crc kubenswrapper[4804]: I0217 13:55:39.114292 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6m6pk"] Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.583824 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2edd89a7-0866-4677-8b25-9654130c6ac5" path="/var/lib/kubelet/pods/2edd89a7-0866-4677-8b25-9654130c6ac5/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.584542 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc37bd5-6784-41f8-98de-ef6a43493cd6" path="/var/lib/kubelet/pods/4bc37bd5-6784-41f8-98de-ef6a43493cd6/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.585021 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9dbe9b-ced6-453d-9f59-0d92e2a69043" path="/var/lib/kubelet/pods/6f9dbe9b-ced6-453d-9f59-0d92e2a69043/volumes" Feb 17 13:55:40 crc kubenswrapper[4804]: I0217 13:55:40.585662 4804 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ba7e6539-c0c9-40e7-b076-38cc23f233cc" path="/var/lib/kubelet/pods/ba7e6539-c0c9-40e7-b076-38cc23f233cc/volumes" Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.026500 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.034146 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.060260 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f8a9-account-create-update-98wtk"] Feb 17 13:55:43 crc kubenswrapper[4804]: I0217 13:55:43.068994 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v8tb5"] Feb 17 13:55:44 crc kubenswrapper[4804]: I0217 13:55:44.588693 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8ee09a-97bd-4497-81cd-2f0f4952d996" path="/var/lib/kubelet/pods/4c8ee09a-97bd-4497-81cd-2f0f4952d996/volumes" Feb 17 13:55:44 crc kubenswrapper[4804]: I0217 13:55:44.590686 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0597f43-df0a-427f-b045-e6859849a0d6" path="/var/lib/kubelet/pods/b0597f43-df0a-427f-b045-e6859849a0d6/volumes" Feb 17 13:55:49 crc kubenswrapper[4804]: I0217 13:55:49.494505 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerID="c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4" exitCode=0 Feb 17 13:55:49 crc kubenswrapper[4804]: I0217 13:55:49.494584 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerDied","Data":"c179411b8526961212962ec76e7aa2e295a2ad91f22528c3d54a0da09b716dc4"} Feb 17 13:55:50 crc kubenswrapper[4804]: I0217 13:55:50.963258 4804 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.018546 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.019148 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.019448 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") pod \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\" (UID: \"5ecc3e55-21c0-4017-8dce-9c77fd2189ea\") " Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.046460 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5" (OuterVolumeSpecName: "kube-api-access-mdrg5") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "kube-api-access-mdrg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.068509 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory" (OuterVolumeSpecName: "inventory") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.091278 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.109568 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c8wmz"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.117678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ecc3e55-21c0-4017-8dce-9c77fd2189ea" (UID: "5ecc3e55-21c0-4017-8dce-9c77fd2189ea"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121620 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrg5\" (UniqueName: \"kubernetes.io/projected/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-kube-api-access-mdrg5\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121656 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.121668 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ecc3e55-21c0-4017-8dce-9c77fd2189ea-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519036 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" event={"ID":"5ecc3e55-21c0-4017-8dce-9c77fd2189ea","Type":"ContainerDied","Data":"8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b"} Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519079 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8403bf1cd856d5e0daf8826b7d963b23d0c64b1af10a42c915eb4e8853a3f40b" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.519140 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.649681 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:51 crc kubenswrapper[4804]: E0217 13:55:51.650446 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.650462 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.651052 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ecc3e55-21c0-4017-8dce-9c77fd2189ea" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.651862 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.655865 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656217 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656346 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.656465 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.665262 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.838672 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.838909 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc 
kubenswrapper[4804]: I0217 13:55:51.838995 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940433 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940516 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.940596 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.944926 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.945123 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.964374 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-499xq\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:51 crc kubenswrapper[4804]: I0217 13:55:51.976057 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.489542 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"] Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.537592 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerStarted","Data":"c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"} Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.575190 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:55:52 crc kubenswrapper[4804]: E0217 13:55:52.575552 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:55:52 crc kubenswrapper[4804]: I0217 13:55:52.585055 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3b824f-ae3d-4681-8b14-16099a2643d5" path="/var/lib/kubelet/pods/6c3b824f-ae3d-4681-8b14-16099a2643d5/volumes" Feb 17 13:55:53 crc kubenswrapper[4804]: I0217 13:55:53.547363 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerStarted","Data":"8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2"} Feb 17 13:55:53 crc kubenswrapper[4804]: I0217 13:55:53.563703 4804 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" podStartSLOduration=2.3661311400000002 podStartE2EDuration="2.56368516s" podCreationTimestamp="2026-02-17 13:55:51 +0000 UTC" firstStartedPulling="2026-02-17 13:55:52.496998982 +0000 UTC m=+1826.608418319" lastFinishedPulling="2026-02-17 13:55:52.694553002 +0000 UTC m=+1826.805972339" observedRunningTime="2026-02-17 13:55:53.562559936 +0000 UTC m=+1827.673979273" watchObservedRunningTime="2026-02-17 13:55:53.56368516 +0000 UTC m=+1827.675104497" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.041575 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.046937 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.058476 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.158129 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.158189 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 
13:56:03.158252 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260233 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260533 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260703 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260735 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.260949 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.280936 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"redhat-operators-lvbbn\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") " pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.373721 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.573660 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:56:03 crc kubenswrapper[4804]: E0217 13:56:03.574174 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:56:03 crc kubenswrapper[4804]: I0217 13:56:03.848575 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"] Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650262 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4" exitCode=0 Feb 17 
13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650350 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"}
Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.650540 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerStarted","Data":"fdf24c14126994bd4aa4f0024928980d88141f189583e3981f58728c0a0db1c4"}
Feb 17 13:56:04 crc kubenswrapper[4804]: I0217 13:56:04.653228 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.040990 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lpd9f"]
Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.052135 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lpd9f"]
Feb 17 13:56:08 crc kubenswrapper[4804]: I0217 13:56:08.589576 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb6c8ec-f280-4566-bb37-b286119956b5" path="/var/lib/kubelet/pods/dfb6c8ec-f280-4566-bb37-b286119956b5/volumes"
Feb 17 13:56:11 crc kubenswrapper[4804]: I0217 13:56:11.718339 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87" exitCode=0
Feb 17 13:56:11 crc kubenswrapper[4804]: I0217 13:56:11.718464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"}
Feb 17 13:56:12 crc kubenswrapper[4804]: I0217 13:56:12.730950 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerStarted","Data":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"}
Feb 17 13:56:12 crc kubenswrapper[4804]: I0217 13:56:12.764885 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lvbbn" podStartSLOduration=2.212643188 podStartE2EDuration="9.764867609s" podCreationTimestamp="2026-02-17 13:56:03 +0000 UTC" firstStartedPulling="2026-02-17 13:56:04.652968601 +0000 UTC m=+1838.764387938" lastFinishedPulling="2026-02-17 13:56:12.205193032 +0000 UTC m=+1846.316612359" observedRunningTime="2026-02-17 13:56:12.756632806 +0000 UTC m=+1846.868052143" watchObservedRunningTime="2026-02-17 13:56:12.764867609 +0000 UTC m=+1846.876286946"
Feb 17 13:56:13 crc kubenswrapper[4804]: I0217 13:56:13.374658 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:13 crc kubenswrapper[4804]: I0217 13:56:13.375032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:14 crc kubenswrapper[4804]: I0217 13:56:14.430821 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lvbbn" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server" probeResult="failure" output=<
Feb 17 13:56:14 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s
Feb 17 13:56:14 crc kubenswrapper[4804]: >
Feb 17 13:56:15 crc kubenswrapper[4804]: I0217 13:56:15.025527 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ncwmc"]
Feb 17 13:56:15 crc kubenswrapper[4804]: I0217 13:56:15.034492 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ncwmc"]
Feb 17 13:56:16 crc kubenswrapper[4804]: I0217 13:56:16.591864 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4895769c-ef45-40c8-a8ae-0c5cb954dab2" path="/var/lib/kubelet/pods/4895769c-ef45-40c8-a8ae-0c5cb954dab2/volumes"
Feb 17 13:56:17 crc kubenswrapper[4804]: I0217 13:56:17.574372 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:17 crc kubenswrapper[4804]: E0217 13:56:17.574674 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.033592 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.042499 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.053346 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hdmw5"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.061881 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.069543 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-46zbc"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.077712 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hdmw5"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.084791 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d59c-account-create-update-phgft"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.091328 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7982-account-create-update-pd5b7"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.098506 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-98b2-account-create-update-648xj"]
Feb 17 13:56:19 crc kubenswrapper[4804]: I0217 13:56:19.105360 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-46zbc"]
Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.591013 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fadc7a-6cf8-4ea0-8609-50e585db4115" path="/var/lib/kubelet/pods/26fadc7a-6cf8-4ea0-8609-50e585db4115/volumes"
Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.591841 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f1cc1f-a736-4c02-9c26-726c0c6f0d59" path="/var/lib/kubelet/pods/35f1cc1f-a736-4c02-9c26-726c0c6f0d59/volumes"
Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.592591 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ee8426-dcbf-4430-8594-68ee778a8bbc" path="/var/lib/kubelet/pods/60ee8426-dcbf-4430-8594-68ee778a8bbc/volumes"
Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.593390 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26c9257-7102-4d48-8999-c0a3f0ca4009" path="/var/lib/kubelet/pods/e26c9257-7102-4d48-8999-c0a3f0ca4009/volumes"
Feb 17 13:56:20 crc kubenswrapper[4804]: I0217 13:56:20.594989 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64978ab-e30e-4ebf-bce0-a8e29d5e5adc" path="/var/lib/kubelet/pods/e64978ab-e30e-4ebf-bce0-a8e29d5e5adc/volumes"
Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.433274 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.501661 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:23 crc kubenswrapper[4804]: I0217 13:56:23.683091 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"]
Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.030091 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dgzbs"]
Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.039681 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dgzbs"]
Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.588676 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9036c7-1cff-4fb8-9af2-90057c4251dc" path="/var/lib/kubelet/pods/fd9036c7-1cff-4fb8-9af2-90057c4251dc/volumes"
Feb 17 13:56:24 crc kubenswrapper[4804]: I0217 13:56:24.853188 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lvbbn" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server" containerID="cri-o://c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" gracePeriod=2
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.301087 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385525 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") "
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385599 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") "
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.385801 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") pod \"59301759-1bac-4d09-97be-b829e799b4d8\" (UID: \"59301759-1bac-4d09-97be-b829e799b4d8\") "
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.389229 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities" (OuterVolumeSpecName: "utilities") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.393306 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz" (OuterVolumeSpecName: "kube-api-access-w97tz") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "kube-api-access-w97tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.487800 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.487841 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w97tz\" (UniqueName: \"kubernetes.io/projected/59301759-1bac-4d09-97be-b829e799b4d8-kube-api-access-w97tz\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.526501 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59301759-1bac-4d09-97be-b829e799b4d8" (UID: "59301759-1bac-4d09-97be-b829e799b4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.593392 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59301759-1bac-4d09-97be-b829e799b4d8-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865292 4804 generic.go:334] "Generic (PLEG): container finished" podID="59301759-1bac-4d09-97be-b829e799b4d8" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e" exitCode=0
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865343 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"}
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865404 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lvbbn" event={"ID":"59301759-1bac-4d09-97be-b829e799b4d8","Type":"ContainerDied","Data":"fdf24c14126994bd4aa4f0024928980d88141f189583e3981f58728c0a0db1c4"}
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865421 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lvbbn"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.865435 4804 scope.go:117] "RemoveContainer" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.917176 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"]
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.917315 4804 scope.go:117] "RemoveContainer" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.934770 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lvbbn"]
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.954662 4804 scope.go:117] "RemoveContainer" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.984739 4804 scope.go:117] "RemoveContainer" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"
Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 13:56:25.985309 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": container with ID starting with c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e not found: ID does not exist" containerID="c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985363 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e"} err="failed to get container status \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": rpc error: code = NotFound desc = could not find container \"c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e\": container with ID starting with c805afcabf9ea42a5e24cd95a24be10fccfa5b09e41dbf935bfd92193826802e not found: ID does not exist"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985396 4804 scope.go:117] "RemoveContainer" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"
Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 13:56:25.985935 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": container with ID starting with b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87 not found: ID does not exist" containerID="b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.985977 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87"} err="failed to get container status \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": rpc error: code = NotFound desc = could not find container \"b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87\": container with ID starting with b3dfdc9369ed3a2971fc0782de468592b129b07a3e522a6a0c2e13ee15dd3f87 not found: ID does not exist"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.986022 4804 scope.go:117] "RemoveContainer" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"
Feb 17 13:56:25 crc kubenswrapper[4804]: E0217 13:56:25.986471 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": container with ID starting with 3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4 not found: ID does not exist" containerID="3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"
Feb 17 13:56:25 crc kubenswrapper[4804]: I0217 13:56:25.986499 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4"} err="failed to get container status \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": rpc error: code = NotFound desc = could not find container \"3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4\": container with ID starting with 3c06e138999c99a4ab47410291000b7b473db950ccf9dfc1efe15e8c156155f4 not found: ID does not exist"
Feb 17 13:56:26 crc kubenswrapper[4804]: I0217 13:56:26.588687 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59301759-1bac-4d09-97be-b829e799b4d8" path="/var/lib/kubelet/pods/59301759-1bac-4d09-97be-b829e799b4d8/volumes"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.483520 4804 scope.go:117] "RemoveContainer" containerID="523c2a0dce1e6efc07d04ec334853ccdc0d1e041c66ee6b003b630197674d70f"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.506742 4804 scope.go:117] "RemoveContainer" containerID="9b6aded40ee8715e414f7eaa0e4d2635fac772bb7db34b9cafa3737130656836"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.556513 4804 scope.go:117] "RemoveContainer" containerID="9bdfcabbaf1ee1e250875698a377ab6bde8ce671649b12731771caa70ec454c1"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.607842 4804 scope.go:117] "RemoveContainer" containerID="95b7f32cb6985d65e04882b6a57442ea7ebdd3da00100304c3a217e8d0730df3"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.664927 4804 scope.go:117] "RemoveContainer" containerID="f94b862fb364184a245162162ecd4a81dc390800d6adb7360015eea9da137ebf"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.704003 4804 scope.go:117] "RemoveContainer" containerID="df5f178d05ce64eb60f91663ba876543b059e11efed3814a687a5cde6c71f197"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.746217 4804 scope.go:117] "RemoveContainer" containerID="195eb227b4e35d11d8a48fcc419fb067302eb3196988b8e72eeeeeb8aa5a6e2e"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.767367 4804 scope.go:117] "RemoveContainer" containerID="a042fc58bb60ee18221f1218414ff109d197e288fe316a76abf5d21b41df0c21"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.785791 4804 scope.go:117] "RemoveContainer" containerID="ed7f04a5bf7a47131ede3cac958534ed66f33e1ae426c629f9157f389db06cde"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.811337 4804 scope.go:117] "RemoveContainer" containerID="5707e03ce1413559d6e451944a8178ed7c1374503c523227f07af12a0d1deda1"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.833807 4804 scope.go:117] "RemoveContainer" containerID="e16d35978c1a93f38aec046090d4bb89a7fa37eda37be7158b82151bac67e327"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.850679 4804 scope.go:117] "RemoveContainer" containerID="2af0e585925ef4ba3eb4997ba9a346fe72a20fb7f9f2943dcb04719e80a69278"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.871522 4804 scope.go:117] "RemoveContainer" containerID="4a8cd13cbb3ba23bfa180f42dc167734c03b2d4bcdf0842db5532816b1f0b9bd"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.893509 4804 scope.go:117] "RemoveContainer" containerID="7dbf5f5d88a50f9cfadbbf6692ca887131d2b4df1c33d00e1f7267394ff4525b"
Feb 17 13:56:29 crc kubenswrapper[4804]: I0217 13:56:29.939560 4804 scope.go:117] "RemoveContainer" containerID="e84b0f31988f4caf559aaf77b9c196ea5e660cca5bf9a529065d3d4f3f6186e1"
Feb 17 13:56:30 crc kubenswrapper[4804]: I0217 13:56:30.578193 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:30 crc kubenswrapper[4804]: E0217 13:56:30.578800 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:42 crc kubenswrapper[4804]: I0217 13:56:42.573689 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:42 crc kubenswrapper[4804]: E0217 13:56:42.574417 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:54 crc kubenswrapper[4804]: I0217 13:56:54.575599 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687"
Feb 17 13:56:54 crc kubenswrapper[4804]: E0217 13:56:54.576920 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85"
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.046164 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jltn7"]
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.058269 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jltn7"]
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.221142 4804 generic.go:334] "Generic (PLEG): container finished" podID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerID="8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2" exitCode=0
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.221194 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerDied","Data":"8502d67bffeaa50b847ab11945f925a93eebe7c2984cfa9357e9a9dabde733e2"}
Feb 17 13:56:56 crc kubenswrapper[4804]: I0217 13:56:56.584853 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15102ce-82ca-49c8-a069-25469380b043" path="/var/lib/kubelet/pods/f15102ce-82ca-49c8-a069-25469380b043/volumes"
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.627338 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758309 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758352 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.758504 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") pod \"5c4e88aa-842f-453a-9ce9-8354c16340e9\" (UID: \"5c4e88aa-842f-453a-9ce9-8354c16340e9\") "
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.767416 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r" (OuterVolumeSpecName: "kube-api-access-5854r") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "kube-api-access-5854r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.784173 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.808265 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory" (OuterVolumeSpecName: "inventory") pod "5c4e88aa-842f-453a-9ce9-8354c16340e9" (UID: "5c4e88aa-842f-453a-9ce9-8354c16340e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863764 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863816 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c4e88aa-842f-453a-9ce9-8354c16340e9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:57 crc kubenswrapper[4804]: I0217 13:56:57.863831 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5854r\" (UniqueName: \"kubernetes.io/projected/5c4e88aa-842f-453a-9ce9-8354c16340e9-kube-api-access-5854r\") on node \"crc\" DevicePath \"\""
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.241883 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq" event={"ID":"5c4e88aa-842f-453a-9ce9-8354c16340e9","Type":"ContainerDied","Data":"c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"}
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.242136 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ff2f72248d2caefe219eeeda003dc9466862a80496ba52f9eda2e288c07614"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.241965 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-499xq"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.323547 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.323995 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324021 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324043 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324052 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324076 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-content"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324084 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-content"
Feb 17 13:56:58 crc kubenswrapper[4804]: E0217 13:56:58.324096 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-utilities"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324104 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="extract-utilities"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324352 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="59301759-1bac-4d09-97be-b829e799b4d8" containerName="registry-server"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.324386 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4e88aa-842f-453a-9ce9-8354c16340e9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.325087 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.327920 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.328099 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.328144 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.329151 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378810 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378884 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.378931 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.392046 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.480556 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.481171 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.481531 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.484369 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.484946 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.495919 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:58 crc kubenswrapper[4804]: I0217 13:56:58.707383 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:56:59 crc kubenswrapper[4804]: I0217 13:56:59.309615 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"]
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.264448 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerStarted","Data":"48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448"}
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.264797 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerStarted","Data":"f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42"}
Feb 17 13:57:00 crc kubenswrapper[4804]: I0217 13:57:00.292518 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" podStartSLOduration=2.113400217 podStartE2EDuration="2.29249222s" podCreationTimestamp="2026-02-17 13:56:58 +0000 UTC" firstStartedPulling="2026-02-17 13:56:59.313923654 +0000 UTC m=+1893.425343001" lastFinishedPulling="2026-02-17 13:56:59.493015667 +0000 UTC m=+1893.604435004" observedRunningTime="2026-02-17 13:57:00.285886217 +0000 UTC m=+1894.397305554" watchObservedRunningTime="2026-02-17 13:57:00.29249222 +0000 UTC m=+1894.403911587"
Feb 17 13:57:04 crc kubenswrapper[4804]: I0217 13:57:04.303604 4804 generic.go:334] "Generic (PLEG): container finished" podID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerID="48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448" exitCode=0
Feb 17 13:57:04 crc kubenswrapper[4804]: I0217 13:57:04.303670 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerDied","Data":"48c424eb76a8a3f09a7eda61042a1fd810e7b2c28cdae1f17671d3c2a494e448"}
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.053613 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.071246 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xf9m6"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.081620 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xf9m6"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.089085 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7kgzk"]
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.738653 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb"
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.832814 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") "
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.833178 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") "
Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.833501 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") pod \"ed6642bc-b49f-4e17-a721-b3eae09246aa\" (UID: \"ed6642bc-b49f-4e17-a721-b3eae09246aa\") " Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.838474 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8" (OuterVolumeSpecName: "kube-api-access-gwzk8") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "kube-api-access-gwzk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.857769 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.860764 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory" (OuterVolumeSpecName: "inventory") pod "ed6642bc-b49f-4e17-a721-b3eae09246aa" (UID: "ed6642bc-b49f-4e17-a721-b3eae09246aa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936381 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwzk8\" (UniqueName: \"kubernetes.io/projected/ed6642bc-b49f-4e17-a721-b3eae09246aa-kube-api-access-gwzk8\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936419 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:05 crc kubenswrapper[4804]: I0217 13:57:05.936430 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6642bc-b49f-4e17-a721-b3eae09246aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.024479 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.032062 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jz9x9"] Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.326978 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" event={"ID":"ed6642bc-b49f-4e17-a721-b3eae09246aa","Type":"ContainerDied","Data":"f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42"} Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.327056 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6bc0e9de5cf01fa79d075997918dc6977588636afdac3a50a3625731e798c42" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.327287 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.406616 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"] Feb 17 13:57:06 crc kubenswrapper[4804]: E0217 13:57:06.409547 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.409601 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.410177 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6642bc-b49f-4e17-a721-b3eae09246aa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.411494 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.413268 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.413960 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.414417 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.416096 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.425784 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"] Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.545989 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.546104 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.546367 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.587561 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e1fc7b-0e6c-4377-b4e0-74e77e951b0d" path="/var/lib/kubelet/pods/14e1fc7b-0e6c-4377-b4e0-74e77e951b0d/volumes" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.588341 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19dd0c13-b898-4147-ae5f-cbc5d4915910" path="/var/lib/kubelet/pods/19dd0c13-b898-4147-ae5f-cbc5d4915910/volumes" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.589296 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96609ec5-c9e0-4611-85ff-f7dc474d889a" path="/var/lib/kubelet/pods/96609ec5-c9e0-4611-85ff-f7dc474d889a/volumes" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.649009 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.650118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: 
\"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.650266 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.653121 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.654959 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.668592 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hx4nm\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:06 crc kubenswrapper[4804]: I0217 13:57:06.743964 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:07 crc kubenswrapper[4804]: I0217 13:57:07.270543 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm"] Feb 17 13:57:07 crc kubenswrapper[4804]: I0217 13:57:07.337863 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerStarted","Data":"5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d"} Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.349850 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerStarted","Data":"906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c"} Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.370999 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" podStartSLOduration=2.195539309 podStartE2EDuration="2.37097786s" podCreationTimestamp="2026-02-17 13:57:06 +0000 UTC" firstStartedPulling="2026-02-17 13:57:07.283843105 +0000 UTC m=+1901.395262442" lastFinishedPulling="2026-02-17 13:57:07.459281656 +0000 UTC m=+1901.570700993" observedRunningTime="2026-02-17 13:57:08.364902453 +0000 UTC m=+1902.476321810" watchObservedRunningTime="2026-02-17 13:57:08.37097786 +0000 UTC m=+1902.482397207" Feb 17 13:57:08 crc kubenswrapper[4804]: I0217 13:57:08.574802 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:57:08 crc kubenswrapper[4804]: E0217 13:57:08.575118 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:57:21 crc kubenswrapper[4804]: I0217 13:57:21.032118 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f9zkj"] Feb 17 13:57:21 crc kubenswrapper[4804]: I0217 13:57:21.039983 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f9zkj"] Feb 17 13:57:22 crc kubenswrapper[4804]: I0217 13:57:22.587566 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a921c8-6579-451b-beaf-9832cf900668" path="/var/lib/kubelet/pods/02a921c8-6579-451b-beaf-9832cf900668/volumes" Feb 17 13:57:23 crc kubenswrapper[4804]: I0217 13:57:23.575157 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:57:23 crc kubenswrapper[4804]: E0217 13:57:23.576075 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.238043 4804 scope.go:117] "RemoveContainer" containerID="872cca29de0693ae54523b0b283408b6320b6200ca8ba4e549db427f9a5d561e" Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.276227 4804 scope.go:117] "RemoveContainer" containerID="604b9ee7bde95746f49c889a56552a71b595a4b833acc7e18a46ed3d41181f64" Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.322145 4804 scope.go:117] "RemoveContainer" 
containerID="ac639ef1a9c58b32b3d0b2c6ada8a7a2aab1ce08a075bd944173f5c820ec7cfc" Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.352725 4804 scope.go:117] "RemoveContainer" containerID="2b0f9e8901b98239ec002ee748081354dc9e4f43d7161d56dae423af6c1770d2" Feb 17 13:57:30 crc kubenswrapper[4804]: I0217 13:57:30.411005 4804 scope.go:117] "RemoveContainer" containerID="4334f8c8c165dce79cf685c7b7ada0d4aa970effa853bf86402b0c64eaa765f2" Feb 17 13:57:37 crc kubenswrapper[4804]: I0217 13:57:37.574096 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:57:37 crc kubenswrapper[4804]: E0217 13:57:37.575200 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:57:41 crc kubenswrapper[4804]: I0217 13:57:41.651525 4804 generic.go:334] "Generic (PLEG): container finished" podID="e9b53a85-8a87-4b65-8832-00c4175da541" containerID="906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c" exitCode=0 Feb 17 13:57:41 crc kubenswrapper[4804]: I0217 13:57:41.651562 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerDied","Data":"906fe7976606f8774bf1bfc4aa2db398a2e4a0a71af3d59f4fd4237e7b6c786c"} Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.083984 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.186910 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.186996 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.187057 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.193003 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2" (OuterVolumeSpecName: "kube-api-access-g2tz2") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "kube-api-access-g2tz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:57:43 crc kubenswrapper[4804]: E0217 13:57:43.215639 4804 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam podName:e9b53a85-8a87-4b65-8832-00c4175da541 nodeName:}" failed. 
No retries permitted until 2026-02-17 13:57:43.715601579 +0000 UTC m=+1937.827020916 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541") : error deleting /var/lib/kubelet/pods/e9b53a85-8a87-4b65-8832-00c4175da541/volume-subpaths: remove /var/lib/kubelet/pods/e9b53a85-8a87-4b65-8832-00c4175da541/volume-subpaths: no such file or directory Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.218292 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory" (OuterVolumeSpecName: "inventory") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.289286 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.289321 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tz2\" (UniqueName: \"kubernetes.io/projected/e9b53a85-8a87-4b65-8832-00c4175da541-kube-api-access-g2tz2\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671417 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" event={"ID":"e9b53a85-8a87-4b65-8832-00c4175da541","Type":"ContainerDied","Data":"5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d"} Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671471 4804 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="5a54e726599954e8f70bf35fe6823e9ba4cce6b5cfacc29c1da8fa06b495654d" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.671481 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hx4nm" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.758588 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:43 crc kubenswrapper[4804]: E0217 13:57:43.758969 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.758988 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.759185 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b53a85-8a87-4b65-8832-00c4175da541" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.759795 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.770396 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.809566 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") pod \"e9b53a85-8a87-4b65-8832-00c4175da541\" (UID: \"e9b53a85-8a87-4b65-8832-00c4175da541\") " Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.813666 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9b53a85-8a87-4b65-8832-00c4175da541" (UID: "e9b53a85-8a87-4b65-8832-00c4175da541"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.912850 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.912925 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.913013 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:43 crc kubenswrapper[4804]: I0217 13:57:43.913078 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b53a85-8a87-4b65-8832-00c4175da541-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014824 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014900 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.014961 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.032098 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.042068 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.050132 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.078892 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:57:44 crc kubenswrapper[4804]: I0217 13:57:44.704120 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq"] Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.702449 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerStarted","Data":"61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428"} Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.702829 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerStarted","Data":"3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226"} Feb 17 13:57:45 crc kubenswrapper[4804]: I0217 13:57:45.725940 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" podStartSLOduration=2.541051077 podStartE2EDuration="2.725915367s" podCreationTimestamp="2026-02-17 13:57:43 +0000 UTC" firstStartedPulling="2026-02-17 13:57:44.695766689 +0000 UTC 
m=+1938.807186026" lastFinishedPulling="2026-02-17 13:57:44.880630979 +0000 UTC m=+1938.992050316" observedRunningTime="2026-02-17 13:57:45.71950206 +0000 UTC m=+1939.830921407" watchObservedRunningTime="2026-02-17 13:57:45.725915367 +0000 UTC m=+1939.837334714" Feb 17 13:57:52 crc kubenswrapper[4804]: I0217 13:57:52.574755 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:57:52 crc kubenswrapper[4804]: E0217 13:57:52.575682 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.043545 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.054848 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.065049 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-582lj"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.072617 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nn6tq"] Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.587145 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1517f905-d980-43be-8583-f1a40170752e" path="/var/lib/kubelet/pods/1517f905-d980-43be-8583-f1a40170752e/volumes" Feb 17 13:58:00 crc kubenswrapper[4804]: I0217 13:58:00.589028 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fa81aac-8f7a-4947-9fbe-c38851b3652e" path="/var/lib/kubelet/pods/5fa81aac-8f7a-4947-9fbe-c38851b3652e/volumes" Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.060883 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.070528 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.078894 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.087641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.097655 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6388-account-create-update-skdjv"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.129551 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-570c-account-create-update-48hmw"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.144503 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2eb5-account-create-update-xv5m7"] Feb 17 13:58:01 crc kubenswrapper[4804]: I0217 13:58:01.156133 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6h6dp"] Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.583773 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d23eb85-73ab-4049-b6be-486640c922e0" path="/var/lib/kubelet/pods/3d23eb85-73ab-4049-b6be-486640c922e0/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.584657 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d9081e-1e94-4244-b66a-34b05bc98f2d" 
path="/var/lib/kubelet/pods/92d9081e-1e94-4244-b66a-34b05bc98f2d/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.585177 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb316de-cd6e-4f79-9387-81f7a8add771" path="/var/lib/kubelet/pods/ccb316de-cd6e-4f79-9387-81f7a8add771/volumes" Feb 17 13:58:02 crc kubenswrapper[4804]: I0217 13:58:02.586030 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c65a30-a890-4d85-80ca-93f9420d5aa4" path="/var/lib/kubelet/pods/f3c65a30-a890-4d85-80ca-93f9420d5aa4/volumes" Feb 17 13:58:03 crc kubenswrapper[4804]: I0217 13:58:03.574835 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:03 crc kubenswrapper[4804]: E0217 13:58:03.575299 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:16 crc kubenswrapper[4804]: I0217 13:58:16.583874 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:16 crc kubenswrapper[4804]: E0217 13:58:16.584676 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.038210 4804 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.048623 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndx9s"] Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.575334 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 13:58:28 crc kubenswrapper[4804]: I0217 13:58:28.600249 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c077b5-d559-4c19-b8ee-f1b7ebf3fc53" path="/var/lib/kubelet/pods/20c077b5-d559-4c19-b8ee-f1b7ebf3fc53/volumes" Feb 17 13:58:29 crc kubenswrapper[4804]: I0217 13:58:29.094783 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"} Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.105724 4804 generic.go:334] "Generic (PLEG): container finished" podID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerID="61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428" exitCode=0 Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.105833 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerDied","Data":"61d3aa7b960b08aa92c348033cb6b61247c79bcc289b0a714c865d0e129fa428"} Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.543551 4804 scope.go:117] "RemoveContainer" containerID="75003012d3c522e6a637465c31ac382126c2c3ac2eb1897adb68193823f330ce" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.566846 4804 scope.go:117] "RemoveContainer" containerID="62fe2cdf4668625c3cfd915d4fddf1e341b2d4a545fb2af5d424708a57a7a4a3" Feb 17 
13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.638988 4804 scope.go:117] "RemoveContainer" containerID="04848e079d7c3dd5aec9613ff12ec81fb185688c9c0af0d2f63039d17f192069" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.680453 4804 scope.go:117] "RemoveContainer" containerID="76d722774285224a6de60017eb8318c4877ef97f9d26d58e45fd8422945c25d0" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.736402 4804 scope.go:117] "RemoveContainer" containerID="feb7469a90aaa528b89392a82772cfa0640653aa5ae69effdca1ed55e8c2a1de" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.773556 4804 scope.go:117] "RemoveContainer" containerID="fc70787b15c0217130d5a14ec0e9948f9e8203a3e166dda3f2555ad7e07ed729" Feb 17 13:58:30 crc kubenswrapper[4804]: I0217 13:58:30.835739 4804 scope.go:117] "RemoveContainer" containerID="cd29054fcbff23437aedab7f24e705fc390169a8546254413b976c34b8bd4901" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.451146 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.563455 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.563817 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.564018 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") pod \"5ca70007-e938-4bd5-9f2a-66f18b87743a\" (UID: \"5ca70007-e938-4bd5-9f2a-66f18b87743a\") " Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.570621 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs" (OuterVolumeSpecName: "kube-api-access-pfqfs") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "kube-api-access-pfqfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.591921 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.604479 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory" (OuterVolumeSpecName: "inventory") pod "5ca70007-e938-4bd5-9f2a-66f18b87743a" (UID: "5ca70007-e938-4bd5-9f2a-66f18b87743a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667460 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667747 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfqfs\" (UniqueName: \"kubernetes.io/projected/5ca70007-e938-4bd5-9f2a-66f18b87743a-kube-api-access-pfqfs\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:31 crc kubenswrapper[4804]: I0217 13:58:31.667763 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ca70007-e938-4bd5-9f2a-66f18b87743a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" 
event={"ID":"5ca70007-e938-4bd5-9f2a-66f18b87743a","Type":"ContainerDied","Data":"3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226"} Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126591 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c10d42299865df6258c432ca9ac58a243094430a1994c7efe5d12fe7c99a226" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.126621 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.207923 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:32 crc kubenswrapper[4804]: E0217 13:58:32.208427 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.208457 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.208740 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca70007-e938-4bd5-9f2a-66f18b87743a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.209500 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.211963 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.211987 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.212163 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.212215 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.222142 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.279835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.279898 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.280119 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380813 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380917 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.380969 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.392423 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.392804 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.415464 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"ssh-known-hosts-edpm-deployment-9jrnh\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:32 crc kubenswrapper[4804]: I0217 13:58:32.537185 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:33 crc kubenswrapper[4804]: I0217 13:58:33.070990 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9jrnh"] Feb 17 13:58:33 crc kubenswrapper[4804]: W0217 13:58:33.076495 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb9b3eb_f3d1_4a32_8a87_b0f686cad260.slice/crio-da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492 WatchSource:0}: Error finding container da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492: Status 404 returned error can't find the container with id da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492 Feb 17 13:58:33 crc kubenswrapper[4804]: I0217 13:58:33.137719 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerStarted","Data":"da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492"} 
Feb 17 13:58:34 crc kubenswrapper[4804]: I0217 13:58:34.147898 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerStarted","Data":"8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b"} Feb 17 13:58:34 crc kubenswrapper[4804]: I0217 13:58:34.172107 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" podStartSLOduration=1.988925283 podStartE2EDuration="2.172086983s" podCreationTimestamp="2026-02-17 13:58:32 +0000 UTC" firstStartedPulling="2026-02-17 13:58:33.079829537 +0000 UTC m=+1987.191248874" lastFinishedPulling="2026-02-17 13:58:33.262991197 +0000 UTC m=+1987.374410574" observedRunningTime="2026-02-17 13:58:34.165192787 +0000 UTC m=+1988.276612124" watchObservedRunningTime="2026-02-17 13:58:34.172086983 +0000 UTC m=+1988.283506310" Feb 17 13:58:40 crc kubenswrapper[4804]: I0217 13:58:40.199131 4804 generic.go:334] "Generic (PLEG): container finished" podID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerID="8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b" exitCode=0 Feb 17 13:58:40 crc kubenswrapper[4804]: I0217 13:58:40.199227 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerDied","Data":"8b9dc76207e8437c272b7a6756665cd4e57acdc64d44d4d72aea29e92acdf28b"} Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.652697 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763504 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763572 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.763644 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") pod \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\" (UID: \"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260\") " Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.770963 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv" (OuterVolumeSpecName: "kube-api-access-n6fnv") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). InnerVolumeSpecName "kube-api-access-n6fnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.792976 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.806760 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" (UID: "cdb9b3eb-f3d1-4a32-8a87-b0f686cad260"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867853 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6fnv\" (UniqueName: \"kubernetes.io/projected/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-kube-api-access-n6fnv\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867889 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:41 crc kubenswrapper[4804]: I0217 13:58:41.867902 4804 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cdb9b3eb-f3d1-4a32-8a87-b0f686cad260-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225018 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" event={"ID":"cdb9b3eb-f3d1-4a32-8a87-b0f686cad260","Type":"ContainerDied","Data":"da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492"} Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225178 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da27589fd2de45de5953d509dcff69b29d0b85cd5476b598cbbe924ed8624492" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.225257 
4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9jrnh" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.292857 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:42 crc kubenswrapper[4804]: E0217 13:58:42.293393 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.293416 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.293693 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb9b3eb-f3d1-4a32-8a87-b0f686cad260" containerName="ssh-known-hosts-edpm-deployment" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.294479 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.298051 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.298427 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.302628 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.302827 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.309852 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.547943 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.548144 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.548245 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650047 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650164 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.650259 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.654067 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: 
\"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.663380 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.668506 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf97c\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:42 crc kubenswrapper[4804]: I0217 13:58:42.752158 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:43 crc kubenswrapper[4804]: I0217 13:58:43.275170 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c"] Feb 17 13:58:43 crc kubenswrapper[4804]: W0217 13:58:43.277446 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb WatchSource:0}: Error finding container b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb: Status 404 returned error can't find the container with id b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.245461 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerStarted","Data":"22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc"} Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.245759 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerStarted","Data":"b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb"} Feb 17 13:58:44 crc kubenswrapper[4804]: I0217 13:58:44.273240 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" podStartSLOduration=2.021755222 podStartE2EDuration="2.273195945s" podCreationTimestamp="2026-02-17 13:58:42 +0000 UTC" firstStartedPulling="2026-02-17 13:58:43.279125861 +0000 UTC m=+1997.390545198" lastFinishedPulling="2026-02-17 13:58:43.530566584 +0000 UTC m=+1997.641985921" observedRunningTime="2026-02-17 
13:58:44.262781359 +0000 UTC m=+1998.374200716" watchObservedRunningTime="2026-02-17 13:58:44.273195945 +0000 UTC m=+1998.384615282" Feb 17 13:58:47 crc kubenswrapper[4804]: I0217 13:58:47.041945 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:58:47 crc kubenswrapper[4804]: I0217 13:58:47.051828 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmp8r"] Feb 17 13:58:48 crc kubenswrapper[4804]: I0217 13:58:48.587591 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6597adc7-fdae-4de0-99bc-87d9807f38f4" path="/var/lib/kubelet/pods/6597adc7-fdae-4de0-99bc-87d9807f38f4/volumes" Feb 17 13:58:51 crc kubenswrapper[4804]: E0217 13:58:51.768703 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-conmon-22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01fe0e44_6604_4e17_bcb4_05f202508fc7.slice/crio-22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc.scope\": RecentStats: unable to find data in memory cache]" Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.034459 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.044061 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wq5kj"] Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.331429 4804 generic.go:334] "Generic (PLEG): container finished" podID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerID="22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc" exitCode=0 Feb 17 13:58:52 crc 
kubenswrapper[4804]: I0217 13:58:52.331483 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerDied","Data":"22cd54d160fc1141491cfd3f3d7de9401f89d305d60f451c7f9ab79a452f96fc"} Feb 17 13:58:52 crc kubenswrapper[4804]: I0217 13:58:52.586586 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11e165e-2605-470a-a865-230b274ce8d3" path="/var/lib/kubelet/pods/c11e165e-2605-470a-a865-230b274ce8d3/volumes" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.715610 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769684 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769851 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.769888 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") pod \"01fe0e44-6604-4e17-bcb4-05f202508fc7\" (UID: \"01fe0e44-6604-4e17-bcb4-05f202508fc7\") " Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.776973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28" (OuterVolumeSpecName: "kube-api-access-pdt28") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "kube-api-access-pdt28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.802165 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory" (OuterVolumeSpecName: "inventory") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.802788 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "01fe0e44-6604-4e17-bcb4-05f202508fc7" (UID: "01fe0e44-6604-4e17-bcb4-05f202508fc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871155 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871209 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdt28\" (UniqueName: \"kubernetes.io/projected/01fe0e44-6604-4e17-bcb4-05f202508fc7-kube-api-access-pdt28\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:53 crc kubenswrapper[4804]: I0217 13:58:53.871221 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01fe0e44-6604-4e17-bcb4-05f202508fc7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.350981 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" event={"ID":"01fe0e44-6604-4e17-bcb4-05f202508fc7","Type":"ContainerDied","Data":"b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb"} Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.351032 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43806aa1776752507ca661a4f947fe1a0861dacaed6dbab9863a5bf750461cb" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.351037 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf97c" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.438724 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:54 crc kubenswrapper[4804]: E0217 13:58:54.439238 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.439256 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.439480 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fe0e44-6604-4e17-bcb4-05f202508fc7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.440067 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.446786 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448115 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448493 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448701 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.448839 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493451 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493523 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.493764 4804 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595237 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595294 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.595760 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.600225 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.602740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.616708 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:54 crc kubenswrapper[4804]: I0217 13:58:54.802565 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:58:55 crc kubenswrapper[4804]: I0217 13:58:55.307605 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66"] Feb 17 13:58:55 crc kubenswrapper[4804]: I0217 13:58:55.371474 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerStarted","Data":"5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7"} Feb 17 13:58:56 crc kubenswrapper[4804]: I0217 13:58:56.384031 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerStarted","Data":"36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7"} Feb 17 13:58:56 crc kubenswrapper[4804]: I0217 13:58:56.404856 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" podStartSLOduration=2.217252199 podStartE2EDuration="2.404837978s" podCreationTimestamp="2026-02-17 13:58:54 +0000 UTC" firstStartedPulling="2026-02-17 13:58:55.316270018 +0000 UTC m=+2009.427689355" lastFinishedPulling="2026-02-17 13:58:55.503855797 +0000 UTC m=+2009.615275134" observedRunningTime="2026-02-17 13:58:56.403175945 +0000 UTC m=+2010.514595292" watchObservedRunningTime="2026-02-17 13:58:56.404837978 +0000 UTC m=+2010.516257315" Feb 17 13:59:04 crc kubenswrapper[4804]: I0217 13:59:04.456841 4804 generic.go:334] "Generic (PLEG): container finished" podID="100d84c5-396c-4772-af09-2e223e72a640" containerID="36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7" exitCode=0 Feb 17 13:59:04 crc kubenswrapper[4804]: I0217 13:59:04.456948 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerDied","Data":"36eea98e6310a84677647c1fb3714e8c2a397adf495a6da21f1cabe7f0c0a0b7"} Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.844819 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925075 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925154 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.925255 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") pod \"100d84c5-396c-4772-af09-2e223e72a640\" (UID: \"100d84c5-396c-4772-af09-2e223e72a640\") " Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.931434 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7" (OuterVolumeSpecName: "kube-api-access-9s4b7") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "kube-api-access-9s4b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.956335 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:05 crc kubenswrapper[4804]: I0217 13:59:05.957288 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory" (OuterVolumeSpecName: "inventory") pod "100d84c5-396c-4772-af09-2e223e72a640" (UID: "100d84c5-396c-4772-af09-2e223e72a640"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028317 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028397 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/100d84c5-396c-4772-af09-2e223e72a640-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.028414 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s4b7\" (UniqueName: \"kubernetes.io/projected/100d84c5-396c-4772-af09-2e223e72a640-kube-api-access-9s4b7\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476174 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" 
event={"ID":"100d84c5-396c-4772-af09-2e223e72a640","Type":"ContainerDied","Data":"5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7"} Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476228 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b78661fdc285bf6f05049d4a9d9f5cf1f82874131daffa67decbbaa3d1036e7" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.476264 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588041 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:06 crc kubenswrapper[4804]: E0217 13:59:06.588715 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588732 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.588935 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="100d84c5-396c-4772-af09-2e223e72a640" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.590435 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.590541 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.593496 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.593813 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594018 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594280 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.594828 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595174 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595371 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.595872 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741066 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741124 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741178 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741244 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741284 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741322 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741408 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741435 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741455 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741502 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741525 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741559 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741584 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.741608 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843435 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843515 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843586 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843620 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843643 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843693 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843725 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843763 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843788 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843811 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843857 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843884 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.843936 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.848175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.848838 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.849211 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850475 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850700 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.850938 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.851010 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.851093 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852052 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852674 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.852710 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.854163 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.855323 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.863461 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-65nc8\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:06 crc kubenswrapper[4804]: I0217 13:59:06.907431 4804 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:07 crc kubenswrapper[4804]: I0217 13:59:07.417632 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8"] Feb 17 13:59:07 crc kubenswrapper[4804]: I0217 13:59:07.484506 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerStarted","Data":"4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da"} Feb 17 13:59:08 crc kubenswrapper[4804]: I0217 13:59:08.492247 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerStarted","Data":"b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442"} Feb 17 13:59:08 crc kubenswrapper[4804]: I0217 13:59:08.515480 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" podStartSLOduration=2.346224595 podStartE2EDuration="2.515459958s" podCreationTimestamp="2026-02-17 13:59:06 +0000 UTC" firstStartedPulling="2026-02-17 13:59:07.419788325 +0000 UTC m=+2021.531207662" lastFinishedPulling="2026-02-17 13:59:07.589023688 +0000 UTC m=+2021.700443025" observedRunningTime="2026-02-17 13:59:08.508301883 +0000 UTC m=+2022.619721220" watchObservedRunningTime="2026-02-17 13:59:08.515459958 +0000 UTC m=+2022.626879285" Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.045272 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.053047 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-s8qtz"] Feb 17 13:59:30 crc kubenswrapper[4804]: I0217 13:59:30.587108 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6d06cb-8252-4c27-815b-1f09a217cbb4" path="/var/lib/kubelet/pods/0b6d06cb-8252-4c27-815b-1f09a217cbb4/volumes" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.029852 4804 scope.go:117] "RemoveContainer" containerID="24aef71ff922a8ddea4d7c3429161120ea76c5281b5a5f51b9b913d40e9cb137" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.067038 4804 scope.go:117] "RemoveContainer" containerID="29efb5e0a9decba15d04c2ad76b8438da8424bb8f92bf46c981df4cb056e18f6" Feb 17 13:59:31 crc kubenswrapper[4804]: I0217 13:59:31.123398 4804 scope.go:117] "RemoveContainer" containerID="3edaad49062f52adf5c7194a9baff45d7b6f8571728650127dd710028add6529" Feb 17 13:59:43 crc kubenswrapper[4804]: I0217 13:59:43.811376 4804 generic.go:334] "Generic (PLEG): container finished" podID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerID="b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442" exitCode=0 Feb 17 13:59:43 crc kubenswrapper[4804]: I0217 13:59:43.811493 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerDied","Data":"b6be46f9b30dbef9a223cc19fe2b815a0349e906da67e5e489219a565cceb442"} Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.338279 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469327 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469420 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469445 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469489 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469512 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469625 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469781 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.469901 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: 
\"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470125 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470487 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.470568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\" (UID: \"0a55b597-4920-4fa6-99d5-6deaa6f30a4a\") " Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.476681 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.481661 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.481804 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483108 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483145 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483156 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483179 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483188 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483262 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483359 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.483418 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s" (OuterVolumeSpecName: "kube-api-access-fs62s") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "kube-api-access-fs62s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.486793 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.505064 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory" (OuterVolumeSpecName: "inventory") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.514936 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a55b597-4920-4fa6-99d5-6deaa6f30a4a" (UID: "0a55b597-4920-4fa6-99d5-6deaa6f30a4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.572931 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.572995 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573010 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573027 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573041 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573054 4804 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573070 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573083 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573097 4804 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573111 4804 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573125 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 
13:59:46.573136 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs62s\" (UniqueName: \"kubernetes.io/projected/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-kube-api-access-fs62s\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573147 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.573157 4804 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a55b597-4920-4fa6-99d5-6deaa6f30a4a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845715 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" event={"ID":"0a55b597-4920-4fa6-99d5-6deaa6f30a4a","Type":"ContainerDied","Data":"4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da"} Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845767 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7c98968a7dfaeb3b3af000332cc3d28899bc087d1522fecd51ab062f8851da" Feb 17 13:59:46 crc kubenswrapper[4804]: I0217 13:59:46.845775 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-65nc8" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448144 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:47 crc kubenswrapper[4804]: E0217 13:59:47.448541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448554 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.448738 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a55b597-4920-4fa6-99d5-6deaa6f30a4a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.449314 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451453 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451497 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451777 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.451889 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.458929 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.466410 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.560433 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561045 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561188 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561486 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.561636 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663480 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663588 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663701 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663743 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.663775 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.664846 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.669621 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.669762 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.670142 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.679489 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v478m\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:47 crc kubenswrapper[4804]: I0217 13:59:47.766531 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 13:59:48 crc kubenswrapper[4804]: I0217 13:59:48.375519 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m"] Feb 17 13:59:48 crc kubenswrapper[4804]: I0217 13:59:48.973876 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerStarted","Data":"8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422"} Feb 17 13:59:49 crc kubenswrapper[4804]: I0217 13:59:49.985242 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerStarted","Data":"12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588"} Feb 17 13:59:50 crc kubenswrapper[4804]: I0217 13:59:50.090023 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" podStartSLOduration=2.898249229 podStartE2EDuration="3.089999888s" podCreationTimestamp="2026-02-17 13:59:47 +0000 UTC" firstStartedPulling="2026-02-17 13:59:48.376602804 +0000 UTC m=+2062.488022151" lastFinishedPulling="2026-02-17 13:59:48.568353473 +0000 UTC m=+2062.679772810" observedRunningTime="2026-02-17 13:59:50.080362835 +0000 UTC m=+2064.191782172" watchObservedRunningTime="2026-02-17 13:59:50.089999888 +0000 UTC m=+2064.201419235" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.142094 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.144649 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.150608 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.150817 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.156349 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280384 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280553 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.280807 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383125 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383413 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.383598 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.385740 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.390188 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.410398 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"collect-profiles-29522280-9z9gc\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.479750 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:00 crc kubenswrapper[4804]: I0217 14:00:00.938861 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc"] Feb 17 14:00:01 crc kubenswrapper[4804]: I0217 14:00:01.081569 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerStarted","Data":"3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f"} Feb 17 14:00:02 crc kubenswrapper[4804]: I0217 14:00:02.098281 4804 generic.go:334] "Generic (PLEG): container finished" podID="e5860044-8a05-47fc-848e-fe988543fbe6" containerID="65a25413b82f8ebba6f197969e84840573a778c78b72f9b376f4a3d6b1b0329b" exitCode=0 Feb 17 14:00:02 crc kubenswrapper[4804]: I0217 14:00:02.098378 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" 
event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerDied","Data":"65a25413b82f8ebba6f197969e84840573a778c78b72f9b376f4a3d6b1b0329b"} Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.465914 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.552984 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.553090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.553252 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") pod \"e5860044-8a05-47fc-848e-fe988543fbe6\" (UID: \"e5860044-8a05-47fc-848e-fe988543fbe6\") " Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.554231 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.558862 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv" (OuterVolumeSpecName: "kube-api-access-xmnwv") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "kube-api-access-xmnwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.563983 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5860044-8a05-47fc-848e-fe988543fbe6" (UID: "e5860044-8a05-47fc-848e-fe988543fbe6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.655979 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnwv\" (UniqueName: \"kubernetes.io/projected/e5860044-8a05-47fc-848e-fe988543fbe6-kube-api-access-xmnwv\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.656017 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5860044-8a05-47fc-848e-fe988543fbe6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:03 crc kubenswrapper[4804]: I0217 14:00:03.656026 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5860044-8a05-47fc-848e-fe988543fbe6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" 
event={"ID":"e5860044-8a05-47fc-848e-fe988543fbe6","Type":"ContainerDied","Data":"3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f"} Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115453 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-9z9gc" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.115463 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3085cbf7ec604fa269706abe77a1c4626eb24d5b8d18de0183c58e390762011f" Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.545476 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.556428 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522235-lnbg8"] Feb 17 14:00:04 crc kubenswrapper[4804]: I0217 14:00:04.584825 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3768c453-c58d-4768-9620-a202cbb8ccd8" path="/var/lib/kubelet/pods/3768c453-c58d-4768-9620-a202cbb8ccd8/volumes" Feb 17 14:00:31 crc kubenswrapper[4804]: I0217 14:00:31.240288 4804 scope.go:117] "RemoveContainer" containerID="4162bfeb135a23379531aee533539dfb67782c33b11814b5fe4b4ead4443c227" Feb 17 14:00:44 crc kubenswrapper[4804]: I0217 14:00:44.474994 4804 generic.go:334] "Generic (PLEG): container finished" podID="be98213b-0510-4f69-9d98-81363c04d8bd" containerID="12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588" exitCode=0 Feb 17 14:00:44 crc kubenswrapper[4804]: I0217 14:00:44.475075 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerDied","Data":"12c7596730ce7431db3737b621eacbf5768ff35bc98e48dfcb2ddd7465e4e588"} Feb 17 
14:00:45 crc kubenswrapper[4804]: I0217 14:00:45.891860 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094043 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094429 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094629 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.094862 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: \"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.095249 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") pod \"be98213b-0510-4f69-9d98-81363c04d8bd\" (UID: 
\"be98213b-0510-4f69-9d98-81363c04d8bd\") " Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.100918 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.104450 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv" (OuterVolumeSpecName: "kube-api-access-tz2qv") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "kube-api-access-tz2qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.127395 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.136799 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory" (OuterVolumeSpecName: "inventory") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.157088 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be98213b-0510-4f69-9d98-81363c04d8bd" (UID: "be98213b-0510-4f69-9d98-81363c04d8bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198348 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198404 4804 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be98213b-0510-4f69-9d98-81363c04d8bd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198420 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198432 4804 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be98213b-0510-4f69-9d98-81363c04d8bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.198444 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz2qv\" (UniqueName: \"kubernetes.io/projected/be98213b-0510-4f69-9d98-81363c04d8bd-kube-api-access-tz2qv\") on node \"crc\" DevicePath \"\"" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.499644 4804 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" event={"ID":"be98213b-0510-4f69-9d98-81363c04d8bd","Type":"ContainerDied","Data":"8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422"} Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.499893 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3443e0a50b60470ef93ad0d7e6c63fd03c0873cfe2fa3786abfaf905be2422" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.500985 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v478m" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624452 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:46 crc kubenswrapper[4804]: E0217 14:00:46.624905 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624926 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: E0217 14:00:46.624952 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.624959 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625151 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5860044-8a05-47fc-848e-fe988543fbe6" containerName="collect-profiles" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625182 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="be98213b-0510-4f69-9d98-81363c04d8bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.625874 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.636959 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.636983 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.637205 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.637279 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.639710 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.639920 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.673997 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709181 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709243 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709270 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709302 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709337 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.709355 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811172 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811242 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811278 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: 
\"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811368 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.811395 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.822986 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc 
kubenswrapper[4804]: I0217 14:00:46.824897 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.827745 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.839896 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.855050 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.860068 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:46 crc kubenswrapper[4804]: I0217 14:00:46.953693 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:00:47 crc kubenswrapper[4804]: W0217 14:00:47.557729 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84938cd5_694c_423a_a0d1_801f28085377.slice/crio-fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d WatchSource:0}: Error finding container fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d: Status 404 returned error can't find the container with id fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d Feb 17 14:00:47 crc kubenswrapper[4804]: I0217 14:00:47.565400 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg"] Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.517287 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerStarted","Data":"dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c"} Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.517631 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerStarted","Data":"fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d"} 
Feb 17 14:00:48 crc kubenswrapper[4804]: I0217 14:00:48.543412 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" podStartSLOduration=2.332133852 podStartE2EDuration="2.543387757s" podCreationTimestamp="2026-02-17 14:00:46 +0000 UTC" firstStartedPulling="2026-02-17 14:00:47.561411365 +0000 UTC m=+2121.672830702" lastFinishedPulling="2026-02-17 14:00:47.77266527 +0000 UTC m=+2121.884084607" observedRunningTime="2026-02-17 14:00:48.537168302 +0000 UTC m=+2122.648587649" watchObservedRunningTime="2026-02-17 14:00:48.543387757 +0000 UTC m=+2122.654807094" Feb 17 14:00:55 crc kubenswrapper[4804]: I0217 14:00:55.835108 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:00:55 crc kubenswrapper[4804]: I0217 14:00:55.835571 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.140518 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.143474 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.175505 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281785 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281822 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281932 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.281970 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383634 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383754 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383848 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.383882 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.390040 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.390748 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.391435 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.402360 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"keystone-cron-29522281-k9ptv\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:00 crc kubenswrapper[4804]: I0217 14:01:00.484577 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.003354 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522281-k9ptv"] Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.644108 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerStarted","Data":"deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66"} Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.644418 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerStarted","Data":"b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194"} Feb 17 14:01:01 crc kubenswrapper[4804]: I0217 14:01:01.667820 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29522281-k9ptv" podStartSLOduration=1.667800275 podStartE2EDuration="1.667800275s" podCreationTimestamp="2026-02-17 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:01:01.65870319 +0000 UTC m=+2135.770122537" watchObservedRunningTime="2026-02-17 14:01:01.667800275 +0000 UTC m=+2135.779219612" Feb 17 14:01:03 crc kubenswrapper[4804]: I0217 14:01:03.665825 4804 generic.go:334] "Generic (PLEG): container finished" podID="c2d1f319-5d08-4969-a968-45eba20958a7" containerID="deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66" exitCode=0 Feb 17 14:01:03 crc kubenswrapper[4804]: I0217 14:01:03.665916 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" 
event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerDied","Data":"deed3aaa8676f3c0d8f2143f71bdec1c0dc234dca7c6bcf52a241acbff2f9e66"} Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.025484 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176536 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176791 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.176976 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.177058 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") pod \"c2d1f319-5d08-4969-a968-45eba20958a7\" (UID: \"c2d1f319-5d08-4969-a968-45eba20958a7\") " Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.181959 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj" 
(OuterVolumeSpecName: "kube-api-access-4qbqj") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "kube-api-access-4qbqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.182395 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.205570 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.236392 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data" (OuterVolumeSpecName: "config-data") pod "c2d1f319-5d08-4969-a968-45eba20958a7" (UID: "c2d1f319-5d08-4969-a968-45eba20958a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279676 4804 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279714 4804 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279723 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qbqj\" (UniqueName: \"kubernetes.io/projected/c2d1f319-5d08-4969-a968-45eba20958a7-kube-api-access-4qbqj\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.279734 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d1f319-5d08-4969-a968-45eba20958a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.682991 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522281-k9ptv" event={"ID":"c2d1f319-5d08-4969-a968-45eba20958a7","Type":"ContainerDied","Data":"b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194"} Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.683425 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b10580fc3079f4e1ccce270ed3c03f975f755bd6366f63cb48dee4c22f68f194" Feb 17 14:01:05 crc kubenswrapper[4804]: I0217 14:01:05.683487 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522281-k9ptv" Feb 17 14:01:25 crc kubenswrapper[4804]: I0217 14:01:25.835275 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:01:25 crc kubenswrapper[4804]: I0217 14:01:25.835883 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:01:31 crc kubenswrapper[4804]: I0217 14:01:31.961143 4804 generic.go:334] "Generic (PLEG): container finished" podID="84938cd5-694c-423a-a0d1-801f28085377" containerID="dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c" exitCode=0 Feb 17 14:01:31 crc kubenswrapper[4804]: I0217 14:01:31.961274 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerDied","Data":"dca45023c7c97a7a87b586cb70d296ad9987cc9764180a5e3b59f9fa0e2be83c"} Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.428292 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479168 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479287 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479350 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: 
\"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.479530 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") pod \"84938cd5-694c-423a-a0d1-801f28085377\" (UID: \"84938cd5-694c-423a-a0d1-801f28085377\") " Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.487223 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf" (OuterVolumeSpecName: "kube-api-access-8fbwf") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "kube-api-access-8fbwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.490678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.507732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.511659 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory" (OuterVolumeSpecName: "inventory") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.521189 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.525781 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "84938cd5-694c-423a-a0d1-801f28085377" (UID: "84938cd5-694c-423a-a0d1-801f28085377"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582055 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582113 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582130 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582146 4804 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582160 4804 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84938cd5-694c-423a-a0d1-801f28085377-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.582174 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fbwf\" (UniqueName: \"kubernetes.io/projected/84938cd5-694c-423a-a0d1-801f28085377-kube-api-access-8fbwf\") on node \"crc\" DevicePath \"\"" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982181 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" event={"ID":"84938cd5-694c-423a-a0d1-801f28085377","Type":"ContainerDied","Data":"fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d"} Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982236 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg" Feb 17 14:01:33 crc kubenswrapper[4804]: I0217 14:01:33.982243 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa1090751bbd8922101933ec4bbfcd3f26a54132de8e8a491da47a32bd612c8d" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081355 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"] Feb 17 14:01:34 crc kubenswrapper[4804]: E0217 14:01:34.081876 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081898 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron" Feb 17 14:01:34 crc kubenswrapper[4804]: E0217 14:01:34.081933 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.081946 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.082188 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="84938cd5-694c-423a-a0d1-801f28085377" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 
14:01:34.082228 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d1f319-5d08-4969-a968-45eba20958a7" containerName="keystone-cron" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.083064 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.085327 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.085940 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.178886 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.179256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.179721 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.191159 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"] Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285345 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285408 4804 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285452 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.285704 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.286065 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.387923 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388308 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388345 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388379 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.388429 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.392408 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.393479 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.394327 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.410396 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.414912 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzfk\" (UniqueName: 
\"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:34 crc kubenswrapper[4804]: I0217 14:01:34.511240 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" Feb 17 14:01:35 crc kubenswrapper[4804]: I0217 14:01:35.006471 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"] Feb 17 14:01:35 crc kubenswrapper[4804]: I0217 14:01:35.011066 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:01:36 crc kubenswrapper[4804]: I0217 14:01:36.001464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerStarted","Data":"8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca"} Feb 17 14:01:36 crc kubenswrapper[4804]: I0217 14:01:36.003537 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerStarted","Data":"b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec"} Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.852558 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" podStartSLOduration=16.679926228 podStartE2EDuration="16.852535604s" podCreationTimestamp="2026-02-17 14:01:34 +0000 UTC" firstStartedPulling="2026-02-17 14:01:35.01087148 +0000 UTC m=+2169.122290817" lastFinishedPulling="2026-02-17 14:01:35.183480856 +0000 UTC m=+2169.294900193" 
observedRunningTime="2026-02-17 14:01:36.023976287 +0000 UTC m=+2170.135395634" watchObservedRunningTime="2026-02-17 14:01:50.852535604 +0000 UTC m=+2184.963954941" Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.859081 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.861288 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.874010 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925234 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925614 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:50 crc kubenswrapper[4804]: I0217 14:01:50.925832 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 
14:01:51.027774 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028036 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028652 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.028817 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.029267 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.045820 4804 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"redhat-marketplace-24vs5\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.190390 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:01:51 crc kubenswrapper[4804]: I0217 14:01:51.836328 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.137707 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" exitCode=0 Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.137810 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483"} Feb 17 14:01:52 crc kubenswrapper[4804]: I0217 14:01:52.138113 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerStarted","Data":"83d3e06f4d72a2eb7be71704c6207c2d36fbeeedb06f9d779450bce36a4899aa"} Feb 17 14:01:53 crc kubenswrapper[4804]: I0217 14:01:53.148690 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" exitCode=0 Feb 17 14:01:53 crc kubenswrapper[4804]: I0217 14:01:53.148764 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" 
event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a"} Feb 17 14:01:54 crc kubenswrapper[4804]: I0217 14:01:54.158694 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerStarted","Data":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"} Feb 17 14:01:54 crc kubenswrapper[4804]: I0217 14:01:54.176829 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-24vs5" podStartSLOduration=2.550049535 podStartE2EDuration="4.176793519s" podCreationTimestamp="2026-02-17 14:01:50 +0000 UTC" firstStartedPulling="2026-02-17 14:01:52.139408885 +0000 UTC m=+2186.250828222" lastFinishedPulling="2026-02-17 14:01:53.766152869 +0000 UTC m=+2187.877572206" observedRunningTime="2026-02-17 14:01:54.175809817 +0000 UTC m=+2188.287229154" watchObservedRunningTime="2026-02-17 14:01:54.176793519 +0000 UTC m=+2188.288212856" Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835007 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835279 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.835321 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.836038 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:01:55 crc kubenswrapper[4804]: I0217 14:01:55.836088 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" gracePeriod=600 Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184002 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" exitCode=0 Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184335 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f"} Feb 17 14:01:56 crc kubenswrapper[4804]: I0217 14:01:56.184371 4804 scope.go:117] "RemoveContainer" containerID="174253766bd9ae14ae00956ce6f8a9280502616a9682343dd2a23341583ff687" Feb 17 14:01:57 crc kubenswrapper[4804]: I0217 14:01:57.194854 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"} Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.191378 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.191819 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.244576 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.298032 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:01 crc kubenswrapper[4804]: I0217 14:02:01.502794 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.248969 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-24vs5" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" containerID="cri-o://74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" gracePeriod=2 Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.755629 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900265 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900403 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.900468 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") pod \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\" (UID: \"5a890ce7-1c49-42c1-8158-a8bb6df28bca\") " Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.901551 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities" (OuterVolumeSpecName: "utilities") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.905840 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq" (OuterVolumeSpecName: "kube-api-access-vqxxq") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "kube-api-access-vqxxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:02:03 crc kubenswrapper[4804]: I0217 14:02:03.928794 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a890ce7-1c49-42c1-8158-a8bb6df28bca" (UID: "5a890ce7-1c49-42c1-8158-a8bb6df28bca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002347 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqxxq\" (UniqueName: \"kubernetes.io/projected/5a890ce7-1c49-42c1-8158-a8bb6df28bca-kube-api-access-vqxxq\") on node \"crc\" DevicePath \"\"" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002383 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.002393 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a890ce7-1c49-42c1-8158-a8bb6df28bca-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260225 4804 generic.go:334] "Generic (PLEG): container finished" podID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" exitCode=0 Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260278 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-24vs5" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260298 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"} Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260770 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-24vs5" event={"ID":"5a890ce7-1c49-42c1-8158-a8bb6df28bca","Type":"ContainerDied","Data":"83d3e06f4d72a2eb7be71704c6207c2d36fbeeedb06f9d779450bce36a4899aa"} Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.260812 4804 scope.go:117] "RemoveContainer" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.297432 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.309439 4804 scope.go:117] "RemoveContainer" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.310794 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-24vs5"] Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.327693 4804 scope.go:117] "RemoveContainer" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.370626 4804 scope.go:117] "RemoveContainer" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.371130 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": container with ID starting with 74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3 not found: ID does not exist" containerID="74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371168 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3"} err="failed to get container status \"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": rpc error: code = NotFound desc = could not find container \"74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3\": container with ID starting with 74d0926df80d0dce416caa4fc4d88ccf25e9777315a27e23b79ddfc84f3162c3 not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371238 4804 scope.go:117] "RemoveContainer" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.371672 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": container with ID starting with 974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a not found: ID does not exist" containerID="974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371790 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a"} err="failed to get container status \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": rpc error: code = NotFound desc = could not find container \"974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a\": container with ID 
starting with 974daf2e291d2c27e5ba0ebab79e8399e76718a9acab7f2630161a0ccc74d58a not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.371879 4804 scope.go:117] "RemoveContainer" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" Feb 17 14:02:04 crc kubenswrapper[4804]: E0217 14:02:04.372252 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": container with ID starting with c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483 not found: ID does not exist" containerID="c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.372348 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483"} err="failed to get container status \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": rpc error: code = NotFound desc = could not find container \"c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483\": container with ID starting with c37b6a1bfe6114a1583bf62c3293d53ff353116a92960724b6d89ab3ee891483 not found: ID does not exist" Feb 17 14:02:04 crc kubenswrapper[4804]: I0217 14:02:04.584762 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" path="/var/lib/kubelet/pods/5a890ce7-1c49-42c1-8158-a8bb6df28bca/volumes" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.508057 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p6ls"] Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511120 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-content" Feb 17 14:03:01 crc 
kubenswrapper[4804]: I0217 14:03:01.511157 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-content" Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511252 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-utilities" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511264 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="extract-utilities" Feb 17 14:03:01 crc kubenswrapper[4804]: E0217 14:03:01.511281 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511291 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.511659 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a890ce7-1c49-42c1-8158-a8bb6df28bca" containerName="registry-server" Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.513846 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.528125 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"]
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612237 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612341 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.612398 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.713815 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714339 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714608 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.714944 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.715326 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.742265 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"community-operators-8p6ls\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") " pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:01 crc kubenswrapper[4804]: I0217 14:03:01.835058 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:02 crc kubenswrapper[4804]: I0217 14:03:02.366350 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"]
Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.201545 4804 generic.go:334] "Generic (PLEG): container finished" podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977" exitCode=0
Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.201908 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"}
Feb 17 14:03:03 crc kubenswrapper[4804]: I0217 14:03:03.202134 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"675f3b5cd0c50bc790b7d5e0a3f30a92a499a3872cebb2edb676d0e9e0b963ee"}
Feb 17 14:03:04 crc kubenswrapper[4804]: I0217 14:03:04.213470 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"}
Feb 17 14:03:05 crc kubenswrapper[4804]: I0217 14:03:05.225038 4804 generic.go:334] "Generic (PLEG): container finished" podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe" exitCode=0
Feb 17 14:03:05 crc kubenswrapper[4804]: I0217 14:03:05.225091 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"}
Feb 17 14:03:06 crc kubenswrapper[4804]: I0217 14:03:06.239398 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerStarted","Data":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"}
Feb 17 14:03:06 crc kubenswrapper[4804]: I0217 14:03:06.267575 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p6ls" podStartSLOduration=2.861565206 podStartE2EDuration="5.267534833s" podCreationTimestamp="2026-02-17 14:03:01 +0000 UTC" firstStartedPulling="2026-02-17 14:03:03.205183912 +0000 UTC m=+2257.316603289" lastFinishedPulling="2026-02-17 14:03:05.611153569 +0000 UTC m=+2259.722572916" observedRunningTime="2026-02-17 14:03:06.258809161 +0000 UTC m=+2260.370228508" watchObservedRunningTime="2026-02-17 14:03:06.267534833 +0000 UTC m=+2260.378954160"
Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.835485 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.836081 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:11 crc kubenswrapper[4804]: I0217 14:03:11.888051 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:12 crc kubenswrapper[4804]: I0217 14:03:12.366605 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:12 crc kubenswrapper[4804]: I0217 14:03:12.423052 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"]
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.314423 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p6ls" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server" containerID="cri-o://b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" gracePeriod=2
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.783977 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868157 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") "
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868299 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") "
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.868626 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") pod \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\" (UID: \"0901d547-00b8-45f5-b76c-d3a87cf88ee3\") "
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.869432 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities" (OuterVolumeSpecName: "utilities") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: "0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.874607 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm" (OuterVolumeSpecName: "kube-api-access-tqmtm") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: "0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "kube-api-access-tqmtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.930142 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0901d547-00b8-45f5-b76c-d3a87cf88ee3" (UID: "0901d547-00b8-45f5-b76c-d3a87cf88ee3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971520 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971559 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0901d547-00b8-45f5-b76c-d3a87cf88ee3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:03:14 crc kubenswrapper[4804]: I0217 14:03:14.971571 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmtm\" (UniqueName: \"kubernetes.io/projected/0901d547-00b8-45f5-b76c-d3a87cf88ee3-kube-api-access-tqmtm\") on node \"crc\" DevicePath \"\""
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.328916 4804 generic.go:334] "Generic (PLEG): container finished" podID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f" exitCode=0
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.328984 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"}
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329024 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p6ls" event={"ID":"0901d547-00b8-45f5-b76c-d3a87cf88ee3","Type":"ContainerDied","Data":"675f3b5cd0c50bc790b7d5e0a3f30a92a499a3872cebb2edb676d0e9e0b963ee"}
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329058 4804 scope.go:117] "RemoveContainer" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.329304 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p6ls"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.375407 4804 scope.go:117] "RemoveContainer" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.381862 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"]
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.390994 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p6ls"]
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.411452 4804 scope.go:117] "RemoveContainer" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439079 4804 scope.go:117] "RemoveContainer" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"
Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.439685 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": container with ID starting with b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f not found: ID does not exist" containerID="b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439733 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f"} err="failed to get container status \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": rpc error: code = NotFound desc = could not find container \"b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f\": container with ID starting with b53cd4c822f6c8eba8cd6a3b1cfd0e738f6a63c14a1c24af928fd310ec81f04f not found: ID does not exist"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.439764 4804 scope.go:117] "RemoveContainer" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"
Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.440249 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": container with ID starting with eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe not found: ID does not exist" containerID="eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440362 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe"} err="failed to get container status \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": rpc error: code = NotFound desc = could not find container \"eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe\": container with ID starting with eff0676e249c837cad290a41965855fe033468df9fcee33750d390ff5ccfa6fe not found: ID does not exist"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440458 4804 scope.go:117] "RemoveContainer" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"
Feb 17 14:03:15 crc kubenswrapper[4804]: E0217 14:03:15.440798 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": container with ID starting with cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977 not found: ID does not exist" containerID="cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"
Feb 17 14:03:15 crc kubenswrapper[4804]: I0217 14:03:15.440839 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977"} err="failed to get container status \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": rpc error: code = NotFound desc = could not find container \"cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977\": container with ID starting with cb7e2b37355090860e3e9a91b1464c77c4023b6564b4a1bf88043aa221cc5977 not found: ID does not exist"
Feb 17 14:03:16 crc kubenswrapper[4804]: I0217 14:03:16.591820 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" path="/var/lib/kubelet/pods/0901d547-00b8-45f5-b76c-d3a87cf88ee3/volumes"
Feb 17 14:04:25 crc kubenswrapper[4804]: I0217 14:04:25.836058 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:04:25 crc kubenswrapper[4804]: I0217 14:04:25.836714 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:04:55 crc kubenswrapper[4804]: I0217 14:04:55.835476 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:04:55 crc kubenswrapper[4804]: I0217 14:04:55.837448 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:05:06 crc kubenswrapper[4804]: I0217 14:05:06.446981 4804 generic.go:334] "Generic (PLEG): container finished" podID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerID="8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca" exitCode=0
Feb 17 14:05:06 crc kubenswrapper[4804]: I0217 14:05:06.447070 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerDied","Data":"8ce565034da923c62aa35b8a82d937d994fde79d28a308a124ad2648ce45eeca"}
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.855335 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990236 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") "
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990302 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") "
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990373 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") "
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990405 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") "
Feb 17 14:05:07 crc kubenswrapper[4804]: I0217 14:05:07.990475 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") pod \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\" (UID: \"c0aad2ba-98cf-42b5-9c03-40633fb8ac18\") "
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.002783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.005245 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk" (OuterVolumeSpecName: "kube-api-access-6bzfk") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "kube-api-access-6bzfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.026860 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.029622 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.034387 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory" (OuterVolumeSpecName: "inventory") pod "c0aad2ba-98cf-42b5-9c03-40633fb8ac18" (UID: "c0aad2ba-98cf-42b5-9c03-40633fb8ac18"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.092743 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.092943 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093030 4804 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093096 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzfk\" (UniqueName: \"kubernetes.io/projected/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-kube-api-access-6bzfk\") on node \"crc\" DevicePath \"\""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.093154 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0aad2ba-98cf-42b5-9c03-40633fb8ac18-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465518 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc" event={"ID":"c0aad2ba-98cf-42b5-9c03-40633fb8ac18","Type":"ContainerDied","Data":"b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec"}
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465555 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b85edd5f6c172e3fc7186590e69abc45c66e58877217d9c682e3a1e6773d16ec"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.465622 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572095 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"]
Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572496 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-content"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572511 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-content"
Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572530 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572536 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server"
Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572553 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-utilities"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572560 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="extract-utilities"
Feb 17 14:05:08 crc kubenswrapper[4804]: E0217 14:05:08.572578 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572585 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572750 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aad2ba-98cf-42b5-9c03-40633fb8ac18" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.572758 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0901d547-00b8-45f5-b76c-d3a87cf88ee3" containerName="registry-server"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.578933 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584482 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584496 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.584844 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585050 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585369 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.585539 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.592584 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"]
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.592776 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704434 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704462 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704596 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704649 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704683 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704752 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704933 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.704994 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806751 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806860 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806909 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.806935 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807027 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807061 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807183 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807668 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807714 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.807987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.811972 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812026 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"
Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812099 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb
17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812179 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.812379 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.816646 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.818355 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.822884 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-x8lml\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:08 crc kubenswrapper[4804]: I0217 14:05:08.911243 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:05:09 crc kubenswrapper[4804]: I0217 14:05:09.451025 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml"] Feb 17 14:05:09 crc kubenswrapper[4804]: I0217 14:05:09.476580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerStarted","Data":"3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d"} Feb 17 14:05:10 crc kubenswrapper[4804]: I0217 14:05:10.486469 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerStarted","Data":"dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa"} Feb 17 14:05:10 crc kubenswrapper[4804]: I0217 14:05:10.509324 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" podStartSLOduration=2.346156847 podStartE2EDuration="2.509300384s" podCreationTimestamp="2026-02-17 14:05:08 +0000 UTC" firstStartedPulling="2026-02-17 14:05:09.467926161 +0000 UTC m=+2383.579345498" lastFinishedPulling="2026-02-17 14:05:09.631069708 +0000 UTC m=+2383.742489035" observedRunningTime="2026-02-17 14:05:10.504495584 +0000 UTC m=+2384.615914931" watchObservedRunningTime="2026-02-17 14:05:10.509300384 +0000 UTC m=+2384.620719721" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.836091 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.837229 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.837303 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.838570 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:05:25 crc kubenswrapper[4804]: I0217 14:05:25.838713 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" gracePeriod=600 Feb 17 14:05:25 crc kubenswrapper[4804]: E0217 14:05:25.961435 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623112 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" exitCode=0 Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623460 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69"} Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.623492 4804 scope.go:117] "RemoveContainer" containerID="8c6714459fe07a3c9e4e4659fffd6afce8b955eae8b1de9a8c8da55e663ec16f" Feb 17 14:05:26 crc kubenswrapper[4804]: I0217 14:05:26.624072 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:26 crc kubenswrapper[4804]: E0217 14:05:26.624412 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.278620 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.280927 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.354737 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449081 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449248 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.449300 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550740 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550804 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.550901 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.551418 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.551634 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.572593 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"certified-operators-2czqf\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:29 crc kubenswrapper[4804]: I0217 14:05:29.652226 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.118953 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658766 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" exitCode=0 Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658838 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb"} Feb 17 14:05:30 crc kubenswrapper[4804]: I0217 14:05:30.658902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"7ad75112ffb62ecf766d5c438a715c9d96e81a84818088bc4b50cdcb499f5951"} Feb 17 14:05:31 crc kubenswrapper[4804]: I0217 14:05:31.684675 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} Feb 17 14:05:32 crc kubenswrapper[4804]: I0217 14:05:32.696622 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" exitCode=0 Feb 17 14:05:32 crc kubenswrapper[4804]: I0217 14:05:32.696681 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" 
event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} Feb 17 14:05:33 crc kubenswrapper[4804]: I0217 14:05:33.707060 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerStarted","Data":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} Feb 17 14:05:33 crc kubenswrapper[4804]: I0217 14:05:33.730379 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2czqf" podStartSLOduration=2.003229686 podStartE2EDuration="4.730362935s" podCreationTimestamp="2026-02-17 14:05:29 +0000 UTC" firstStartedPulling="2026-02-17 14:05:30.660699477 +0000 UTC m=+2404.772118814" lastFinishedPulling="2026-02-17 14:05:33.387832726 +0000 UTC m=+2407.499252063" observedRunningTime="2026-02-17 14:05:33.721997946 +0000 UTC m=+2407.833417293" watchObservedRunningTime="2026-02-17 14:05:33.730362935 +0000 UTC m=+2407.841782272" Feb 17 14:05:37 crc kubenswrapper[4804]: I0217 14:05:37.574120 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:37 crc kubenswrapper[4804]: E0217 14:05:37.574727 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.653821 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc 
kubenswrapper[4804]: I0217 14:05:39.654153 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.704801 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.824187 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:39 crc kubenswrapper[4804]: I0217 14:05:39.942173 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:41 crc kubenswrapper[4804]: I0217 14:05:41.790892 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2czqf" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" containerID="cri-o://f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" gracePeriod=2 Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.264657 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424009 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424155 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.424243 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") pod \"4d76421b-4776-498f-a065-58f55d0e6e19\" (UID: \"4d76421b-4776-498f-a065-58f55d0e6e19\") " Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.427178 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities" (OuterVolumeSpecName: "utilities") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.431480 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6" (OuterVolumeSpecName: "kube-api-access-29lf6") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "kube-api-access-29lf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.488714 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d76421b-4776-498f-a065-58f55d0e6e19" (UID: "4d76421b-4776-498f-a065-58f55d0e6e19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.527835 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.528056 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d76421b-4776-498f-a065-58f55d0e6e19-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.528172 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lf6\" (UniqueName: \"kubernetes.io/projected/4d76421b-4776-498f-a065-58f55d0e6e19-kube-api-access-29lf6\") on node \"crc\" DevicePath \"\"" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800836 4804 generic.go:334] "Generic (PLEG): container finished" podID="4d76421b-4776-498f-a065-58f55d0e6e19" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" exitCode=0 Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800900 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2czqf" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.800902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.801284 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2czqf" event={"ID":"4d76421b-4776-498f-a065-58f55d0e6e19","Type":"ContainerDied","Data":"7ad75112ffb62ecf766d5c438a715c9d96e81a84818088bc4b50cdcb499f5951"} Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.801330 4804 scope.go:117] "RemoveContainer" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.898641 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.908513 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2czqf"] Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.915102 4804 scope.go:117] "RemoveContainer" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.941089 4804 scope.go:117] "RemoveContainer" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.985895 4804 scope.go:117] "RemoveContainer" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986322 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": container with ID starting with f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194 not found: ID does not exist" containerID="f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986363 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194"} err="failed to get container status \"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": rpc error: code = NotFound desc = could not find container \"f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194\": container with ID starting with f9afca94b3afd95ee78d3e47148abcd6ac029939c61406abdc0531935f323194 not found: ID does not exist" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986389 4804 scope.go:117] "RemoveContainer" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986604 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": container with ID starting with 6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e not found: ID does not exist" containerID="6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986627 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e"} err="failed to get container status \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": rpc error: code = NotFound desc = could not find container \"6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e\": container with ID 
starting with 6c782bf72b3e6b23a97963dc8b9004a701f337d828238e2155052f6e88f9a05e not found: ID does not exist" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986642 4804 scope.go:117] "RemoveContainer" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: E0217 14:05:42.986842 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": container with ID starting with 1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb not found: ID does not exist" containerID="1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb" Feb 17 14:05:42 crc kubenswrapper[4804]: I0217 14:05:42.986869 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb"} err="failed to get container status \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": rpc error: code = NotFound desc = could not find container \"1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb\": container with ID starting with 1a4df0575144925b5b5b52e0b974d9ed4ae42a22497d6b980d5cf223b19a02fb not found: ID does not exist" Feb 17 14:05:44 crc kubenswrapper[4804]: I0217 14:05:44.595491 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" path="/var/lib/kubelet/pods/4d76421b-4776-498f-a065-58f55d0e6e19/volumes" Feb 17 14:05:48 crc kubenswrapper[4804]: I0217 14:05:48.574042 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:05:48 crc kubenswrapper[4804]: E0217 14:05:48.575672 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:00 crc kubenswrapper[4804]: I0217 14:06:00.574063 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:00 crc kubenswrapper[4804]: E0217 14:06:00.574833 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:14 crc kubenswrapper[4804]: I0217 14:06:14.574627 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:14 crc kubenswrapper[4804]: E0217 14:06:14.575639 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:27 crc kubenswrapper[4804]: I0217 14:06:27.574537 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:27 crc kubenswrapper[4804]: E0217 14:06:27.576655 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.862934 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.863971 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-content" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.863989 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-content" Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.864019 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-utilities" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864027 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="extract-utilities" Feb 17 14:06:38 crc kubenswrapper[4804]: E0217 14:06:38.864044 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864051 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.864274 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d76421b-4776-498f-a065-58f55d0e6e19" containerName="registry-server" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.865917 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.881964 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885391 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.885474 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987690 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987706 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987785 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:38 crc kubenswrapper[4804]: I0217 14:06:38.987965 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.010008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"redhat-operators-lhfth\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.186566 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.574073 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:39 crc kubenswrapper[4804]: E0217 14:06:39.574698 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:39 crc kubenswrapper[4804]: I0217 14:06:39.701469 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375565 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50" exitCode=0 Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375650 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50"} Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.375942 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerStarted","Data":"71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817"} Feb 17 14:06:40 crc kubenswrapper[4804]: I0217 14:06:40.377968 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:06:42 crc 
kubenswrapper[4804]: I0217 14:06:42.394378 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46" exitCode=0 Feb 17 14:06:42 crc kubenswrapper[4804]: I0217 14:06:42.394434 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46"} Feb 17 14:06:43 crc kubenswrapper[4804]: I0217 14:06:43.412406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerStarted","Data":"a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127"} Feb 17 14:06:43 crc kubenswrapper[4804]: I0217 14:06:43.436540 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhfth" podStartSLOduration=2.853122149 podStartE2EDuration="5.436518053s" podCreationTimestamp="2026-02-17 14:06:38 +0000 UTC" firstStartedPulling="2026-02-17 14:06:40.376981549 +0000 UTC m=+2474.488400886" lastFinishedPulling="2026-02-17 14:06:42.960377453 +0000 UTC m=+2477.071796790" observedRunningTime="2026-02-17 14:06:43.428606638 +0000 UTC m=+2477.540025975" watchObservedRunningTime="2026-02-17 14:06:43.436518053 +0000 UTC m=+2477.547937390" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.187231 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.189538 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.252776 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.543314 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:49 crc kubenswrapper[4804]: I0217 14:06:49.590845 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:51 crc kubenswrapper[4804]: I0217 14:06:51.511217 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lhfth" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server" containerID="cri-o://a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" gracePeriod=2 Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526069 4804 generic.go:334] "Generic (PLEG): container finished" podID="185fa21c-049f-41b3-9031-318a3c21ecef" containerID="a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" exitCode=0 Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526230 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127"} Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526420 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhfth" event={"ID":"185fa21c-049f-41b3-9031-318a3c21ecef","Type":"ContainerDied","Data":"71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817"} Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.526442 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c0d4db4d6b00eb21e2b36531f6d666ffe8b21b661819d2b23f0bd4323aa817" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.574165 4804 scope.go:117] 
"RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:06:52 crc kubenswrapper[4804]: E0217 14:06:52.574552 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.615848 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.766820 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.767090 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.767153 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") pod \"185fa21c-049f-41b3-9031-318a3c21ecef\" (UID: \"185fa21c-049f-41b3-9031-318a3c21ecef\") " Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.770117 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities" (OuterVolumeSpecName: "utilities") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.775256 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9" (OuterVolumeSpecName: "kube-api-access-qdsv9") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "kube-api-access-qdsv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.869592 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.869638 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdsv9\" (UniqueName: \"kubernetes.io/projected/185fa21c-049f-41b3-9031-318a3c21ecef-kube-api-access-qdsv9\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.895037 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "185fa21c-049f-41b3-9031-318a3c21ecef" (UID: "185fa21c-049f-41b3-9031-318a3c21ecef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:52 crc kubenswrapper[4804]: I0217 14:06:52.971675 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185fa21c-049f-41b3-9031-318a3c21ecef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.535628 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhfth" Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.573686 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:53 crc kubenswrapper[4804]: I0217 14:06:53.586921 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhfth"] Feb 17 14:06:54 crc kubenswrapper[4804]: I0217 14:06:54.587468 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" path="/var/lib/kubelet/pods/185fa21c-049f-41b3-9031-318a3c21ecef/volumes" Feb 17 14:07:03 crc kubenswrapper[4804]: I0217 14:07:03.574435 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:03 crc kubenswrapper[4804]: E0217 14:07:03.575379 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:07:15 crc kubenswrapper[4804]: I0217 14:07:15.574507 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:15 crc 
kubenswrapper[4804]: E0217 14:07:15.575808 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:07:25 crc kubenswrapper[4804]: I0217 14:07:25.823021 4804 generic.go:334] "Generic (PLEG): container finished" podID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerID="dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa" exitCode=0 Feb 17 14:07:25 crc kubenswrapper[4804]: I0217 14:07:25.823154 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerDied","Data":"dbda66ccca14c400ac04b20a535081d1e040f266a1132798c4aceb72485b84fa"} Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.227239 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274587 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274657 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274678 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274725 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274760 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274819 4804 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274847 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274871 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.274900 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") pod \"9f17dd92-0402-40c7-bdc7-50b38e37f750\" (UID: \"9f17dd92-0402-40c7-bdc7-50b38e37f750\") " Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.280446 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.280678 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f" (OuterVolumeSpecName: "kube-api-access-gv28f") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "kube-api-access-gv28f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.307022 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory" (OuterVolumeSpecName: "inventory") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.307538 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.308837 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.310403 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.311500 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.315404 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.319453 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9f17dd92-0402-40c7-bdc7-50b38e37f750" (UID: "9f17dd92-0402-40c7-bdc7-50b38e37f750"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.376849 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377171 4804 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377323 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377513 4804 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377664 4804 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377758 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv28f\" (UniqueName: \"kubernetes.io/projected/9f17dd92-0402-40c7-bdc7-50b38e37f750-kube-api-access-gv28f\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377851 4804 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-migration-ssh-key-1\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.377947 4804 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.378004 4804 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9f17dd92-0402-40c7-bdc7-50b38e37f750-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.575339 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.576094 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844668 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" event={"ID":"9f17dd92-0402-40c7-bdc7-50b38e37f750","Type":"ContainerDied","Data":"3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d"} Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844715 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3737f6f0131a3f5e82616cb8a9012b910af27541cdeac5c9048e6ea1b4d2299d" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.844755 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x8lml" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.935988 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"] Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936432 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-utilities" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936452 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-utilities" Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936463 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936472 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server" Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936496 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936504 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 14:07:27 crc kubenswrapper[4804]: E0217 14:07:27.936539 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-content" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936546 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="extract-content" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936782 4804 
memory_manager.go:354] "RemoveStaleState removing state" podUID="185fa21c-049f-41b3-9031-318a3c21ecef" containerName="registry-server" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.936803 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f17dd92-0402-40c7-bdc7-50b38e37f750" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.937532 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939779 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939922 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.939934 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-29gqz" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.940013 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.940256 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.955244 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"] Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988070 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988131 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988166 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988227 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988260 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988416 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:27 crc kubenswrapper[4804]: I0217 14:07:27.988629 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.089985 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090059 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090114 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090217 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090293 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090319 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.090355 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.095865 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096063 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096472 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.096874 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: 
\"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.097274 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.097485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.108548 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wtq55\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.258372 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.760930 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55"] Feb 17 14:07:28 crc kubenswrapper[4804]: I0217 14:07:28.853853 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerStarted","Data":"d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e"} Feb 17 14:07:29 crc kubenswrapper[4804]: I0217 14:07:29.867396 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerStarted","Data":"27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3"} Feb 17 14:07:42 crc kubenswrapper[4804]: I0217 14:07:42.575718 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:42 crc kubenswrapper[4804]: E0217 14:07:42.576490 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:07:57 crc kubenswrapper[4804]: I0217 14:07:57.574403 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:07:57 crc kubenswrapper[4804]: E0217 14:07:57.575138 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:08:10 crc kubenswrapper[4804]: I0217 14:08:10.574797 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:08:10 crc kubenswrapper[4804]: E0217 14:08:10.575591 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:08:21 crc kubenswrapper[4804]: I0217 14:08:21.574300 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:08:21 crc kubenswrapper[4804]: E0217 14:08:21.575040 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:08:35 crc kubenswrapper[4804]: I0217 14:08:35.574276 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:08:35 crc kubenswrapper[4804]: E0217 14:08:35.574931 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:08:48 crc kubenswrapper[4804]: I0217 14:08:48.574709 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:08:48 crc kubenswrapper[4804]: E0217 14:08:48.575611 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:09:00 crc kubenswrapper[4804]: I0217 14:09:00.591273 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:00 crc kubenswrapper[4804]: E0217 14:09:00.592376 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:09:13 crc kubenswrapper[4804]: I0217 14:09:13.574689 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:13 crc kubenswrapper[4804]: E0217 14:09:13.575467 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:09:24 crc kubenswrapper[4804]: I0217 14:09:24.574731 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:24 crc kubenswrapper[4804]: E0217 14:09:24.575634 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:09:39 crc kubenswrapper[4804]: I0217 14:09:39.574468 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:39 crc kubenswrapper[4804]: E0217 14:09:39.575120 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:09:41 crc kubenswrapper[4804]: I0217 14:09:41.238877 4804 generic.go:334] "Generic (PLEG): container finished" podID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerID="27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3" exitCode=0 Feb 17 14:09:41 crc kubenswrapper[4804]: 
I0217 14:09:41.238975 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerDied","Data":"27e33d208e79739a22cd976d57adcf8d08a28d6918b4dc63998a98c640c7b7d3"} Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.707884 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789568 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789614 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789682 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789806 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 
14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789836 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789883 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.789912 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") pod \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\" (UID: \"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7\") " Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.807429 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.811392 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c" (OuterVolumeSpecName: "kube-api-access-cnn4c") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). 
InnerVolumeSpecName "kube-api-access-cnn4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.820006 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.821623 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.826640 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.827275 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.829490 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory" (OuterVolumeSpecName: "inventory") pod "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" (UID: "0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892407 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnn4c\" (UniqueName: \"kubernetes.io/projected/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-kube-api-access-cnn4c\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892660 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892673 4804 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892686 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892695 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 
14:09:42.892705 4804 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:42 crc kubenswrapper[4804]: I0217 14:09:42.892713 4804 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260588 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" event={"ID":"0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7","Type":"ContainerDied","Data":"d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e"} Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260637 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c61072e65010bf5df8b12c5d629af6bb39700b5930687931000fc84258080e" Feb 17 14:09:43 crc kubenswrapper[4804]: I0217 14:09:43.260643 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wtq55" Feb 17 14:09:53 crc kubenswrapper[4804]: I0217 14:09:53.574097 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:09:53 crc kubenswrapper[4804]: E0217 14:09:53.574665 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:08 crc kubenswrapper[4804]: I0217 14:10:08.574190 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:08 crc kubenswrapper[4804]: E0217 14:10:08.575484 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:23 crc kubenswrapper[4804]: I0217 14:10:23.574092 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:23 crc kubenswrapper[4804]: E0217 14:10:23.574859 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.439106 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:26 crc kubenswrapper[4804]: E0217 14:10:26.439817 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.439831 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.440014 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.440685 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.443536 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.444411 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.444858 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.445730 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.454556 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546635 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546695 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546730 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546837 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.546877 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547077 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547181 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547357 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.547547 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648838 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648911 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.648978 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649019 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") 
pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649046 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649066 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649118 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649148 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649191 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 
14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649653 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649649 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.649983 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.651572 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.655506 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.655681 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.657232 4804 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.660597 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.661671 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.663398 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.666889 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.680717 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " pod="openstack/tempest-tests-tempest" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.779519 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4" Feb 17 14:10:26 crc kubenswrapper[4804]: I0217 14:10:26.787536 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:10:27 crc kubenswrapper[4804]: I0217 14:10:27.231165 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 17 14:10:27 crc kubenswrapper[4804]: I0217 14:10:27.668626 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerStarted","Data":"35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321"} Feb 17 14:10:35 crc kubenswrapper[4804]: I0217 14:10:35.575127 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:10:36 crc kubenswrapper[4804]: I0217 14:10:36.768971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.550837 4804 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.551455 4804 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-548bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(f7b246dc-1d07-4725-b471-88fe82584d24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:11:04 crc kubenswrapper[4804]: E0217 14:11:04.552737 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" Feb 17 14:11:05 crc kubenswrapper[4804]: E0217 14:11:05.068441 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" Feb 17 14:11:17 crc 
kubenswrapper[4804]: I0217 14:11:17.037078 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 14:11:18 crc kubenswrapper[4804]: I0217 14:11:18.209186 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerStarted","Data":"d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98"} Feb 17 14:11:18 crc kubenswrapper[4804]: I0217 14:11:18.238088 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.446061712 podStartE2EDuration="53.238065479s" podCreationTimestamp="2026-02-17 14:10:25 +0000 UTC" firstStartedPulling="2026-02-17 14:10:27.241142569 +0000 UTC m=+2701.352561906" lastFinishedPulling="2026-02-17 14:11:17.033146336 +0000 UTC m=+2751.144565673" observedRunningTime="2026-02-17 14:11:18.225375165 +0000 UTC m=+2752.336794502" watchObservedRunningTime="2026-02-17 14:11:18.238065479 +0000 UTC m=+2752.349484816" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.140605 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.143432 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.156154 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283136 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283190 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.283707 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386234 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386291 4804 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386444 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386892 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.386925 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.418175 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"redhat-marketplace-v6fhg\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.482249 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:30 crc kubenswrapper[4804]: I0217 14:12:30.935642 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.871857 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" exitCode=0 Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.871964 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3"} Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.872243 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerStarted","Data":"fc548041e26d0614531ae99cf8e30f06221c5f4d8be0ee5276ce2c338d7913a8"} Feb 17 14:12:31 crc kubenswrapper[4804]: I0217 14:12:31.875989 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:12:33 crc kubenswrapper[4804]: I0217 14:12:33.893537 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" exitCode=0 Feb 17 14:12:33 crc kubenswrapper[4804]: I0217 14:12:33.893778 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734"} Feb 17 14:12:34 crc kubenswrapper[4804]: I0217 14:12:34.910134 4804 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerStarted","Data":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} Feb 17 14:12:34 crc kubenswrapper[4804]: I0217 14:12:34.929387 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6fhg" podStartSLOduration=2.2719784499999998 podStartE2EDuration="4.929370771s" podCreationTimestamp="2026-02-17 14:12:30 +0000 UTC" firstStartedPulling="2026-02-17 14:12:31.874518708 +0000 UTC m=+2825.985938085" lastFinishedPulling="2026-02-17 14:12:34.531911069 +0000 UTC m=+2828.643330406" observedRunningTime="2026-02-17 14:12:34.925780368 +0000 UTC m=+2829.037199705" watchObservedRunningTime="2026-02-17 14:12:34.929370771 +0000 UTC m=+2829.040790108" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.482825 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.483711 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:40 crc kubenswrapper[4804]: I0217 14:12:40.585360 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:41 crc kubenswrapper[4804]: I0217 14:12:41.023006 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:41 crc kubenswrapper[4804]: I0217 14:12:41.090850 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:42 crc kubenswrapper[4804]: I0217 14:12:42.989252 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6fhg" 
podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" containerID="cri-o://6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" gracePeriod=2 Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.468119 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550667 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.550718 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") pod \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\" (UID: \"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1\") " Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.551302 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities" (OuterVolumeSpecName: "utilities") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.556658 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d" (OuterVolumeSpecName: "kube-api-access-5ml5d") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "kube-api-access-5ml5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.573461 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" (UID: "8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653224 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ml5d\" (UniqueName: \"kubernetes.io/projected/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-kube-api-access-5ml5d\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653265 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:43 crc kubenswrapper[4804]: I0217 14:12:43.653279 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005475 4804 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" 
containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" exitCode=0 Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005521 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005550 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6fhg" event={"ID":"8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1","Type":"ContainerDied","Data":"fc548041e26d0614531ae99cf8e30f06221c5f4d8be0ee5276ce2c338d7913a8"} Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005566 4804 scope.go:117] "RemoveContainer" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.005697 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6fhg" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.025672 4804 scope.go:117] "RemoveContainer" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.058087 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.071046 4804 scope.go:117] "RemoveContainer" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.071063 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6fhg"] Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.104480 4804 scope.go:117] "RemoveContainer" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.105124 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": container with ID starting with 6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913 not found: ID does not exist" containerID="6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105229 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913"} err="failed to get container status \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": rpc error: code = NotFound desc = could not find container \"6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913\": container with ID starting with 6da9349288b511807551c3d38ce707bceb766b6c8772f0dee6d9ba6cac138913 not found: 
ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105291 4804 scope.go:117] "RemoveContainer" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.105754 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": container with ID starting with cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734 not found: ID does not exist" containerID="cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105801 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734"} err="failed to get container status \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": rpc error: code = NotFound desc = could not find container \"cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734\": container with ID starting with cb1ba2ff1c2492a4aa44fe8ec7c0a001fed6d1e45c3c3350c681bd7e9cbf2734 not found: ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.105834 4804 scope.go:117] "RemoveContainer" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: E0217 14:12:44.106171 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": container with ID starting with a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3 not found: ID does not exist" containerID="a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.106237 4804 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3"} err="failed to get container status \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": rpc error: code = NotFound desc = could not find container \"a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3\": container with ID starting with a6e80dd12b575e040cf83707a8f17bbbd44006a931291c56034df5502fa278d3 not found: ID does not exist" Feb 17 14:12:44 crc kubenswrapper[4804]: I0217 14:12:44.584829 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" path="/var/lib/kubelet/pods/8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1/volumes" Feb 17 14:12:55 crc kubenswrapper[4804]: I0217 14:12:55.835295 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:12:55 crc kubenswrapper[4804]: I0217 14:12:55.835893 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:25 crc kubenswrapper[4804]: I0217 14:13:25.835715 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:13:25 crc kubenswrapper[4804]: I0217 14:13:25.836365 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.594462 4804 scope.go:117] "RemoveContainer" containerID="63d2f999f3b20ab902c761ba7009b771de4228c739dc86596893270097ddcd50" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.624323 4804 scope.go:117] "RemoveContainer" containerID="a4f78e134f4672850b3042a62c45a06ef12ddc5102c5309d4d677ecae6fb3127" Feb 17 14:13:31 crc kubenswrapper[4804]: I0217 14:13:31.690546 4804 scope.go:117] "RemoveContainer" containerID="f01193bdf1e82e2a6e3fea01bf103d3eb3adc6c861f2b2ac1692bb87fc0b6c46" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835076 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835610 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.835651 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.836355 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:13:55 crc kubenswrapper[4804]: I0217 14:13:55.836403 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" gracePeriod=600 Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.674865 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" exitCode=0 Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675058 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723"} Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675597 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} Feb 17 14:13:56 crc kubenswrapper[4804]: I0217 14:13:56.675621 4804 scope.go:117] "RemoveContainer" containerID="f2b1ceb8fb4fad3e569c5b6981a67ebe95c4fab4dc0ebdb126ef709837a1eb69" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.006606 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 
14:14:08.016705 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-content" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.016949 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-content" Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 14:14:08.017068 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-utilities" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017158 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="extract-utilities" Feb 17 14:14:08 crc kubenswrapper[4804]: E0217 14:14:08.017291 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017380 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.017755 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4fc3c8-f1f3-49c0-9017-d6f8d6f5dcb1" containerName="registry-server" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.019917 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.035136 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.187584 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.188594 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.188648 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290322 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290455 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290509 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.290890 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.291237 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.320884 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"community-operators-bg24h\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.362147 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:08 crc kubenswrapper[4804]: I0217 14:14:08.894423 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792516 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c" exitCode=0 Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792593 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"} Feb 17 14:14:09 crc kubenswrapper[4804]: I0217 14:14:09.792909 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"22b93f76de275c61d7af6af439b9be25047ded458855de738f24cea5fd962af2"} Feb 17 14:14:10 crc kubenswrapper[4804]: I0217 14:14:10.803810 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} Feb 17 14:14:11 crc kubenswrapper[4804]: I0217 14:14:11.816740 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a" exitCode=0 Feb 17 14:14:11 crc kubenswrapper[4804]: I0217 14:14:11.816850 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" 
event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} Feb 17 14:14:12 crc kubenswrapper[4804]: I0217 14:14:12.839010 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerStarted","Data":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"} Feb 17 14:14:12 crc kubenswrapper[4804]: I0217 14:14:12.859328 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bg24h" podStartSLOduration=3.407514602 podStartE2EDuration="5.859306179s" podCreationTimestamp="2026-02-17 14:14:07 +0000 UTC" firstStartedPulling="2026-02-17 14:14:09.795505748 +0000 UTC m=+2923.906925105" lastFinishedPulling="2026-02-17 14:14:12.247297345 +0000 UTC m=+2926.358716682" observedRunningTime="2026-02-17 14:14:12.856961446 +0000 UTC m=+2926.968380793" watchObservedRunningTime="2026-02-17 14:14:12.859306179 +0000 UTC m=+2926.970725506" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.362698 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.363339 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.410716 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.940399 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:18 crc kubenswrapper[4804]: I0217 14:14:18.994106 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:20 crc kubenswrapper[4804]: I0217 14:14:20.913290 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bg24h" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server" containerID="cri-o://2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" gracePeriod=2 Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.407783 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438379 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438690 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.438917 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") pod \"ea983551-05ac-4386-8afb-4c1e289de6bd\" (UID: \"ea983551-05ac-4386-8afb-4c1e289de6bd\") " Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.445500 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp" (OuterVolumeSpecName: "kube-api-access-c56fp") pod 
"ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "kube-api-access-c56fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.448008 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities" (OuterVolumeSpecName: "utilities") pod "ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.502588 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea983551-05ac-4386-8afb-4c1e289de6bd" (UID: "ea983551-05ac-4386-8afb-4c1e289de6bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541163 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c56fp\" (UniqueName: \"kubernetes.io/projected/ea983551-05ac-4386-8afb-4c1e289de6bd-kube-api-access-c56fp\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541218 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.541233 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea983551-05ac-4386-8afb-4c1e289de6bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926334 4804 generic.go:334] "Generic (PLEG): container finished" podID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" exitCode=0 Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926402 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"} Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926445 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bg24h" event={"ID":"ea983551-05ac-4386-8afb-4c1e289de6bd","Type":"ContainerDied","Data":"22b93f76de275c61d7af6af439b9be25047ded458855de738f24cea5fd962af2"} Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.926474 4804 scope.go:117] "RemoveContainer" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 
14:14:21.926648 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bg24h" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.951640 4804 scope.go:117] "RemoveContainer" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.985502 4804 scope.go:117] "RemoveContainer" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c" Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.986118 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:21 crc kubenswrapper[4804]: I0217 14:14:21.994279 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bg24h"] Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.031840 4804 scope.go:117] "RemoveContainer" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.032468 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": container with ID starting with 2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d not found: ID does not exist" containerID="2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d" Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.032620 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d"} err="failed to get container status \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": rpc error: code = NotFound desc = could not find container \"2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d\": container with ID starting with 
2342f8c6451398d66f181de8a5d3b29544ae9318989c33a33b3ddfd7cee4235d not found: ID does not exist" Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.032757 4804 scope.go:117] "RemoveContainer" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a" Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.033430 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": container with ID starting with bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a not found: ID does not exist" containerID="bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a" Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.033501 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a"} err="failed to get container status \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": rpc error: code = NotFound desc = could not find container \"bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a\": container with ID starting with bc1cdc0059fa9bb83c9a6a706d533a235f3b8ee47c33e63cc295960aaa40992a not found: ID does not exist" Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.033538 4804 scope.go:117] "RemoveContainer" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c" Feb 17 14:14:22 crc kubenswrapper[4804]: E0217 14:14:22.033919 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": container with ID starting with ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c not found: ID does not exist" containerID="ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c" Feb 17 14:14:22 crc 
kubenswrapper[4804]: I0217 14:14:22.034008 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c"} err="failed to get container status \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": rpc error: code = NotFound desc = could not find container \"ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c\": container with ID starting with ea87c1c20e3ee1bc10d2928dbe764d8d8ebee9c61ba8fd1bfd54a3df2486e65c not found: ID does not exist" Feb 17 14:14:22 crc kubenswrapper[4804]: I0217 14:14:22.594871 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" path="/var/lib/kubelet/pods/ea983551-05ac-4386-8afb-4c1e289de6bd/volumes" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.150708 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"] Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.152857 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-utilities" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.152953 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-utilities" Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.153034 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153122 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server" Feb 17 14:15:00 crc kubenswrapper[4804]: E0217 14:15:00.153193 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" 
containerName="extract-content" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153271 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="extract-content" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.153643 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea983551-05ac-4386-8afb-4c1e289de6bd" containerName="registry-server" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.154490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.159691 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.160022 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.168264 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"] Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248249 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248651 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod 
\"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.248812 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350600 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350647 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.350723 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.352080 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.360240 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.367395 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"collect-profiles-29522295-lptcp\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.473014 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:00 crc kubenswrapper[4804]: I0217 14:15:00.939028 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp"] Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637077 4804 generic.go:334] "Generic (PLEG): container finished" podID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerID="377a1a93986753ac71ee083bd27b66be5e6cf98f0f0c8284bcdc9bdc6c8b8e33" exitCode=0 Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637136 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerDied","Data":"377a1a93986753ac71ee083bd27b66be5e6cf98f0f0c8284bcdc9bdc6c8b8e33"} Feb 17 14:15:01 crc kubenswrapper[4804]: I0217 14:15:01.637412 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerStarted","Data":"d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b"} Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.012455 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.102989 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.104292 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.104374 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") pod \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\" (UID: \"c3e8a4e9-ee0a-4283-835f-de5a54c8136d\") " Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.105153 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.105551 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.109783 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.110240 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q" (OuterVolumeSpecName: "kube-api-access-qlj9q") pod "c3e8a4e9-ee0a-4283-835f-de5a54c8136d" (UID: "c3e8a4e9-ee0a-4283-835f-de5a54c8136d"). InnerVolumeSpecName "kube-api-access-qlj9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.207284 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.207324 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlj9q\" (UniqueName: \"kubernetes.io/projected/c3e8a4e9-ee0a-4283-835f-de5a54c8136d-kube-api-access-qlj9q\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.653971 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" event={"ID":"c3e8a4e9-ee0a-4283-835f-de5a54c8136d","Type":"ContainerDied","Data":"d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b"} Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.654252 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3ac1606b3844e4108d0ef4f9f435f220a5046c56eb0032070247a6cd4bbd90b" Feb 17 14:15:03 crc kubenswrapper[4804]: I0217 14:15:03.654003 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-lptcp" Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.087614 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.096305 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522250-ddtb9"] Feb 17 14:15:04 crc kubenswrapper[4804]: I0217 14:15:04.584945 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f0ac4b-5b59-4ff9-92ba-54668fffef27" path="/var/lib/kubelet/pods/f9f0ac4b-5b59-4ff9-92ba-54668fffef27/volumes" Feb 17 14:15:31 crc kubenswrapper[4804]: I0217 14:15:31.821840 4804 scope.go:117] "RemoveContainer" containerID="c63647c4f782e7514611e89775cb3101cab0f160b6675c0b2e9972791cd22306" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.707039 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:38 crc kubenswrapper[4804]: E0217 14:15:38.708734 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.708755 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.708975 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e8a4e9-ee0a-4283-835f-de5a54c8136d" containerName="collect-profiles" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.712029 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.724041 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787162 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787820 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.787947 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889644 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889745 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.889875 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.890144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.890307 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:38 crc kubenswrapper[4804]: I0217 14:15:38.918097 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"certified-operators-mhl95\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.053849 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.553669 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987512 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a" exitCode=0 Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987637 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"} Feb 17 14:15:39 crc kubenswrapper[4804]: I0217 14:15:39.987837 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerStarted","Data":"75a2dce20291625dd97df834d1901fc5bd4bff2bee391da393378fee4ed223cb"} Feb 17 14:15:42 crc kubenswrapper[4804]: I0217 14:15:42.013086 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d" exitCode=0 Feb 17 14:15:42 crc kubenswrapper[4804]: I0217 14:15:42.013183 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"} Feb 17 14:15:43 crc kubenswrapper[4804]: I0217 14:15:43.024155 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" 
event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerStarted","Data":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"} Feb 17 14:15:43 crc kubenswrapper[4804]: I0217 14:15:43.044914 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhl95" podStartSLOduration=2.524710135 podStartE2EDuration="5.044897066s" podCreationTimestamp="2026-02-17 14:15:38 +0000 UTC" firstStartedPulling="2026-02-17 14:15:39.98924274 +0000 UTC m=+3014.100662077" lastFinishedPulling="2026-02-17 14:15:42.509429671 +0000 UTC m=+3016.620849008" observedRunningTime="2026-02-17 14:15:43.041694665 +0000 UTC m=+3017.153114002" watchObservedRunningTime="2026-02-17 14:15:43.044897066 +0000 UTC m=+3017.156316393" Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.054016 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.054666 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.106675 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.153745 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:49 crc kubenswrapper[4804]: I0217 14:15:49.348115 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.089098 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhl95" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" 
containerID="cri-o://18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" gracePeriod=2 Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.585893 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732413 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732477 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.732517 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") pod \"66512154-e5a4-4d46-9d1b-a091a9f9631d\" (UID: \"66512154-e5a4-4d46-9d1b-a091a9f9631d\") " Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.733469 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities" (OuterVolumeSpecName: "utilities") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.734448 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.745546 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g" (OuterVolumeSpecName: "kube-api-access-7hj8g") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "kube-api-access-7hj8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.784330 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66512154-e5a4-4d46-9d1b-a091a9f9631d" (UID: "66512154-e5a4-4d46-9d1b-a091a9f9631d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.836653 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hj8g\" (UniqueName: \"kubernetes.io/projected/66512154-e5a4-4d46-9d1b-a091a9f9631d-kube-api-access-7hj8g\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:51 crc kubenswrapper[4804]: I0217 14:15:51.836703 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66512154-e5a4-4d46-9d1b-a091a9f9631d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099839 4804 generic.go:334] "Generic (PLEG): container finished" podID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" exitCode=0 Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099925 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhl95" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.099907 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"} Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.100096 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhl95" event={"ID":"66512154-e5a4-4d46-9d1b-a091a9f9631d","Type":"ContainerDied","Data":"75a2dce20291625dd97df834d1901fc5bd4bff2bee391da393378fee4ed223cb"} Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.100131 4804 scope.go:117] "RemoveContainer" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.135339 4804 scope.go:117] "RemoveContainer" 
containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.151843 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.163161 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhl95"] Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.169304 4804 scope.go:117] "RemoveContainer" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205167 4804 scope.go:117] "RemoveContainer" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.205613 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": container with ID starting with 18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa not found: ID does not exist" containerID="18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205650 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa"} err="failed to get container status \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": rpc error: code = NotFound desc = could not find container \"18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa\": container with ID starting with 18a87c38e30d94a34dbec876b7db21190f53b3ef286e038634bac0605fb31ffa not found: ID does not exist" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205678 4804 scope.go:117] "RemoveContainer" 
containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d" Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.205943 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": container with ID starting with fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d not found: ID does not exist" containerID="fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.205993 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d"} err="failed to get container status \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": rpc error: code = NotFound desc = could not find container \"fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d\": container with ID starting with fc7a42701f50553fc13c96957a701885c01b685335b98aadf5b0ab6df73eda0d not found: ID does not exist" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.206012 4804 scope.go:117] "RemoveContainer" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a" Feb 17 14:15:52 crc kubenswrapper[4804]: E0217 14:15:52.206264 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": container with ID starting with 9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a not found: ID does not exist" containerID="9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.206294 4804 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a"} err="failed to get container status \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": rpc error: code = NotFound desc = could not find container \"9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a\": container with ID starting with 9dfdcb0d9f1f2a6e6fee92b77d55f2bcf82e3cb7aa6edcaeca0ef57b91dae36a not found: ID does not exist" Feb 17 14:15:52 crc kubenswrapper[4804]: I0217 14:15:52.587441 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" path="/var/lib/kubelet/pods/66512154-e5a4-4d46-9d1b-a091a9f9631d/volumes" Feb 17 14:16:25 crc kubenswrapper[4804]: I0217 14:16:25.835682 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:16:25 crc kubenswrapper[4804]: I0217 14:16:25.836398 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:16:55 crc kubenswrapper[4804]: I0217 14:16:55.835828 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:16:55 crc kubenswrapper[4804]: I0217 14:16:55.838458 4804 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.835838 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.836561 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.836633 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.837696 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:17:25 crc kubenswrapper[4804]: I0217 14:17:25.837768 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" 
containerID="cri-o://cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" gracePeriod=600 Feb 17 14:17:25 crc kubenswrapper[4804]: E0217 14:17:25.966914 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966435 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" exitCode=0 Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966529 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce"} Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.966884 4804 scope.go:117] "RemoveContainer" containerID="3270aad0cb169aa901bc212cfebf2f76e758028c073b7389992ac0738318b723" Feb 17 14:17:26 crc kubenswrapper[4804]: I0217 14:17:26.967914 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:26 crc kubenswrapper[4804]: E0217 14:17:26.968514 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" 
podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:37 crc kubenswrapper[4804]: I0217 14:17:37.574810 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:37 crc kubenswrapper[4804]: E0217 14:17:37.575966 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:17:48 crc kubenswrapper[4804]: I0217 14:17:48.575163 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:17:48 crc kubenswrapper[4804]: E0217 14:17:48.576061 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:03 crc kubenswrapper[4804]: I0217 14:18:03.574371 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:03 crc kubenswrapper[4804]: E0217 14:18:03.575317 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:17 crc kubenswrapper[4804]: I0217 14:18:17.574836 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:17 crc kubenswrapper[4804]: E0217 14:18:17.576418 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:31 crc kubenswrapper[4804]: I0217 14:18:31.574892 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:31 crc kubenswrapper[4804]: E0217 14:18:31.576219 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:42 crc kubenswrapper[4804]: I0217 14:18:42.574038 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:42 crc kubenswrapper[4804]: E0217 14:18:42.574828 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:18:54 crc kubenswrapper[4804]: I0217 14:18:54.574749 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:18:54 crc kubenswrapper[4804]: E0217 14:18:54.575622 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:08 crc kubenswrapper[4804]: I0217 14:19:08.574744 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:08 crc kubenswrapper[4804]: E0217 14:19:08.575794 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:19 crc kubenswrapper[4804]: I0217 14:19:19.574037 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:19 crc kubenswrapper[4804]: E0217 14:19:19.574826 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:32 crc kubenswrapper[4804]: I0217 14:19:32.574603 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:32 crc kubenswrapper[4804]: E0217 14:19:32.575334 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:43 crc kubenswrapper[4804]: I0217 14:19:43.573837 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:43 crc kubenswrapper[4804]: E0217 14:19:43.574658 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:19:55 crc kubenswrapper[4804]: I0217 14:19:55.574350 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:19:55 crc kubenswrapper[4804]: E0217 14:19:55.575183 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:08 crc kubenswrapper[4804]: I0217 14:20:08.575123 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:08 crc kubenswrapper[4804]: E0217 14:20:08.576351 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:22 crc kubenswrapper[4804]: I0217 14:20:22.574348 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:22 crc kubenswrapper[4804]: E0217 14:20:22.575324 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:36 crc kubenswrapper[4804]: I0217 14:20:36.580163 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:36 crc kubenswrapper[4804]: E0217 14:20:36.581096 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:20:50 crc kubenswrapper[4804]: I0217 14:20:50.573860 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:20:50 crc kubenswrapper[4804]: E0217 14:20:50.574779 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:03 crc kubenswrapper[4804]: I0217 14:21:03.575029 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:03 crc kubenswrapper[4804]: E0217 14:21:03.577145 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391115 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391620 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" 
containerName="extract-utilities" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391892 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-utilities" Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391923 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391933 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: E0217 14:21:06.391963 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-content" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.391971 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="extract-content" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.392174 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="66512154-e5a4-4d46-9d1b-a091a9f9631d" containerName="registry-server" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.393653 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.412724 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542827 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.542998 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644583 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644685 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.644810 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.645284 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.645714 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.669681 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"redhat-operators-nnnnl\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:06 crc kubenswrapper[4804]: I0217 14:21:06.730750 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:07 crc kubenswrapper[4804]: I0217 14:21:07.216497 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:07 crc kubenswrapper[4804]: I0217 14:21:07.459718 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"afdba798f22337cb141f4e18f48d4777e222a4f386e4ae115dbbb8ab5c633f89"} Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.467902 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad" exitCode=0 Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.468038 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"} Feb 17 14:21:08 crc kubenswrapper[4804]: I0217 14:21:08.470927 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:21:09 crc kubenswrapper[4804]: I0217 14:21:09.478001 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} Feb 17 14:21:10 crc kubenswrapper[4804]: I0217 14:21:10.491489 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1" exitCode=0 Feb 17 14:21:10 crc kubenswrapper[4804]: I0217 14:21:10.491544 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} Feb 17 14:21:11 crc kubenswrapper[4804]: I0217 14:21:11.505261 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerStarted","Data":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"} Feb 17 14:21:11 crc kubenswrapper[4804]: I0217 14:21:11.534048 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnnnl" podStartSLOduration=3.096888525 podStartE2EDuration="5.534023433s" podCreationTimestamp="2026-02-17 14:21:06 +0000 UTC" firstStartedPulling="2026-02-17 14:21:08.470449108 +0000 UTC m=+3342.581868445" lastFinishedPulling="2026-02-17 14:21:10.907584016 +0000 UTC m=+3345.019003353" observedRunningTime="2026-02-17 14:21:11.522113489 +0000 UTC m=+3345.633532826" watchObservedRunningTime="2026-02-17 14:21:11.534023433 +0000 UTC m=+3345.645442770" Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.600584 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:16 crc kubenswrapper[4804]: E0217 14:21:16.602063 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.733084 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:16 crc kubenswrapper[4804]: I0217 14:21:16.733462 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:17 crc kubenswrapper[4804]: I0217 14:21:17.786874 4804 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnnnl" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" probeResult="failure" output=< Feb 17 14:21:17 crc kubenswrapper[4804]: timeout: failed to connect service ":50051" within 1s Feb 17 14:21:17 crc kubenswrapper[4804]: > Feb 17 14:21:26 crc kubenswrapper[4804]: I0217 14:21:26.782109 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:26 crc kubenswrapper[4804]: I0217 14:21:26.834256 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:27 crc kubenswrapper[4804]: I0217 14:21:27.022971 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:27 crc kubenswrapper[4804]: I0217 14:21:27.573895 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:27 crc kubenswrapper[4804]: E0217 14:21:27.574173 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:28 crc kubenswrapper[4804]: I0217 14:21:28.652958 4804 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-nnnnl" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" containerID="cri-o://6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" gracePeriod=2 Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.186136 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294400 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294610 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.294689 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") pod \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\" (UID: \"49a4d451-c363-4a82-aa9d-78f76fb0eb2f\") " Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.295824 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities" (OuterVolumeSpecName: "utilities") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.306383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85" (OuterVolumeSpecName: "kube-api-access-5qb85") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "kube-api-access-5qb85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.397273 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qb85\" (UniqueName: \"kubernetes.io/projected/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-kube-api-access-5qb85\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.397310 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.423950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a4d451-c363-4a82-aa9d-78f76fb0eb2f" (UID: "49a4d451-c363-4a82-aa9d-78f76fb0eb2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.499234 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a4d451-c363-4a82-aa9d-78f76fb0eb2f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664812 4804 generic.go:334] "Generic (PLEG): container finished" podID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" exitCode=0 Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"} Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664903 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnnl" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664930 4804 scope.go:117] "RemoveContainer" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.664913 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnnl" event={"ID":"49a4d451-c363-4a82-aa9d-78f76fb0eb2f","Type":"ContainerDied","Data":"afdba798f22337cb141f4e18f48d4777e222a4f386e4ae115dbbb8ab5c633f89"} Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.692349 4804 scope.go:117] "RemoveContainer" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.707191 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.716590 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnnnl"] Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.735597 4804 scope.go:117] "RemoveContainer" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770458 4804 scope.go:117] "RemoveContainer" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 14:21:29.770829 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": container with ID starting with 6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1 not found: ID does not exist" containerID="6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770878 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1"} err="failed to get container status \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": rpc error: code = NotFound desc = could not find container \"6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1\": container with ID starting with 6500bb8773c862debd41ccc3567a629f48c49330d5888564735a416aad17ddc1 not found: ID does not exist" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.770919 4804 scope.go:117] "RemoveContainer" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1" Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 14:21:29.771319 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": container with ID starting with b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1 not found: ID does not exist" containerID="b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771344 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1"} err="failed to get container status \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": rpc error: code = NotFound desc = could not find container \"b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1\": container with ID starting with b04e9a314d6d32f142690819301de27dc49953afdfa3fed256984f48c40569c1 not found: ID does not exist" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771360 4804 scope.go:117] "RemoveContainer" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad" Feb 17 14:21:29 crc kubenswrapper[4804]: E0217 
14:21:29.771552 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": container with ID starting with 9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad not found: ID does not exist" containerID="9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad" Feb 17 14:21:29 crc kubenswrapper[4804]: I0217 14:21:29.771580 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad"} err="failed to get container status \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": rpc error: code = NotFound desc = could not find container \"9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad\": container with ID starting with 9cbcce906e45e234e41e79feed19f281c1d1e0f6849fa7a707466990d423c9ad not found: ID does not exist" Feb 17 14:21:30 crc kubenswrapper[4804]: I0217 14:21:30.584634 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" path="/var/lib/kubelet/pods/49a4d451-c363-4a82-aa9d-78f76fb0eb2f/volumes" Feb 17 14:21:40 crc kubenswrapper[4804]: I0217 14:21:40.575532 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:40 crc kubenswrapper[4804]: E0217 14:21:40.577020 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:21:54 crc kubenswrapper[4804]: I0217 14:21:54.573903 
4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:21:54 crc kubenswrapper[4804]: E0217 14:21:54.574635 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:22:08 crc kubenswrapper[4804]: I0217 14:22:08.574183 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:22:08 crc kubenswrapper[4804]: E0217 14:22:08.575092 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:22:22 crc kubenswrapper[4804]: I0217 14:22:22.574131 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:22:22 crc kubenswrapper[4804]: E0217 14:22:22.575094 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:22:34 crc kubenswrapper[4804]: I0217 
14:22:34.574541 4804 scope.go:117] "RemoveContainer" containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:22:35 crc kubenswrapper[4804]: I0217 14:22:35.298183 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"} Feb 17 14:22:36 crc kubenswrapper[4804]: I0217 14:22:36.324710 4804 generic.go:334] "Generic (PLEG): container finished" podID="f7b246dc-1d07-4725-b471-88fe82584d24" containerID="d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98" exitCode=0 Feb 17 14:22:36 crc kubenswrapper[4804]: I0217 14:22:36.324991 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerDied","Data":"d537c8e502573d470d3444dc025ba077411e9d8c16e3d0c7fcbea501f31e4c98"} Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.754407 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831061 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831131 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831165 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831183 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831240 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 
14:22:37.831312 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831347 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831377 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.831470 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") pod \"f7b246dc-1d07-4725-b471-88fe82584d24\" (UID: \"f7b246dc-1d07-4725-b471-88fe82584d24\") " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832318 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832908 4804 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.832914 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data" (OuterVolumeSpecName: "config-data") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.836763 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt" (OuterVolumeSpecName: "kube-api-access-548bt") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "kube-api-access-548bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.839101 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.835509 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.860950 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.861803 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.863311 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.878768 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f7b246dc-1d07-4725-b471-88fe82584d24" (UID: "f7b246dc-1d07-4725-b471-88fe82584d24"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935095 4804 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935133 4804 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935165 4804 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935174 4804 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935186 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935212 4804 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/f7b246dc-1d07-4725-b471-88fe82584d24-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935224 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-548bt\" (UniqueName: \"kubernetes.io/projected/f7b246dc-1d07-4725-b471-88fe82584d24-kube-api-access-548bt\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.935233 4804 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7b246dc-1d07-4725-b471-88fe82584d24-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:37 crc kubenswrapper[4804]: I0217 14:22:37.955127 4804 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.037230 4804 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.343753 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.343756 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7b246dc-1d07-4725-b471-88fe82584d24","Type":"ContainerDied","Data":"35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321"} Feb 17 14:22:38 crc kubenswrapper[4804]: I0217 14:22:38.344500 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35721c59346596c631486087761565338b01be7cc9c8b0659285af567a265321" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.964544 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965511 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-content" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965528 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-content" Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965541 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965548 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner" Feb 17 14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965562 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-utilities" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965568 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="extract-utilities" Feb 17 
14:22:41 crc kubenswrapper[4804]: E0217 14:22:41.965583 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965589 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965752 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a4d451-c363-4a82-aa9d-78f76fb0eb2f" containerName="registry-server" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.965764 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b246dc-1d07-4725-b471-88fe82584d24" containerName="tempest-tests-tempest-tests-runner" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.967212 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:41 crc kubenswrapper[4804]: I0217 14:22:41.979613 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015602 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015669 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 
17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.015692 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117409 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117497 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.117520 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.118008 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 
crc kubenswrapper[4804]: I0217 14:22:42.118312 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.136454 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"redhat-marketplace-8chgg\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.289544 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:42 crc kubenswrapper[4804]: I0217 14:22:42.746232 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.425495 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" exitCode=0 Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.425563 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348"} Feb 17 14:22:43 crc kubenswrapper[4804]: I0217 14:22:43.426008 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" 
event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"c32ab401f3ef9117e51efe6faab4692742c641c8ab361c05e3938b5036ba0972"} Feb 17 14:22:44 crc kubenswrapper[4804]: I0217 14:22:44.441444 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"} Feb 17 14:22:45 crc kubenswrapper[4804]: I0217 14:22:45.453183 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" exitCode=0 Feb 17 14:22:45 crc kubenswrapper[4804]: I0217 14:22:45.453256 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"} Feb 17 14:22:46 crc kubenswrapper[4804]: I0217 14:22:46.464757 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerStarted","Data":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"} Feb 17 14:22:46 crc kubenswrapper[4804]: I0217 14:22:46.492155 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8chgg" podStartSLOduration=3.07562732 podStartE2EDuration="5.492134866s" podCreationTimestamp="2026-02-17 14:22:41 +0000 UTC" firstStartedPulling="2026-02-17 14:22:43.427821415 +0000 UTC m=+3437.539240752" lastFinishedPulling="2026-02-17 14:22:45.844328961 +0000 UTC m=+3439.955748298" observedRunningTime="2026-02-17 14:22:46.483433743 +0000 UTC m=+3440.594853090" watchObservedRunningTime="2026-02-17 14:22:46.492134866 +0000 UTC 
m=+3440.603554193" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.347196 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.349047 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.351167 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kssn4" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.383532 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.466574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.466732 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.567968 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.568056 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.568455 4804 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.595699 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t626\" (UniqueName: \"kubernetes.io/projected/4c6dcbcb-8248-40b5-8fd6-7824c487109e-kube-api-access-5t626\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.596306 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4c6dcbcb-8248-40b5-8fd6-7824c487109e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:49 crc kubenswrapper[4804]: I0217 14:22:49.691162 4804 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 14:22:50 crc kubenswrapper[4804]: I0217 14:22:50.130547 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 14:22:50 crc kubenswrapper[4804]: W0217 14:22:50.135396 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c6dcbcb_8248_40b5_8fd6_7824c487109e.slice/crio-18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe WatchSource:0}: Error finding container 18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe: Status 404 returned error can't find the container with id 18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe Feb 17 14:22:50 crc kubenswrapper[4804]: I0217 14:22:50.688721 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4c6dcbcb-8248-40b5-8fd6-7824c487109e","Type":"ContainerStarted","Data":"18a8be274eae3cb9141d7d3f769ba4cf263179c0419d3df5388160a88be480fe"} Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.290531 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.292235 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.346129 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.704387 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"4c6dcbcb-8248-40b5-8fd6-7824c487109e","Type":"ContainerStarted","Data":"63f7e5eaa00772f47394801064b6c0c3f65c0725404e6632fc6fc9a62c196e00"} Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.719484 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.401090714 podStartE2EDuration="3.719468024s" podCreationTimestamp="2026-02-17 14:22:49 +0000 UTC" firstStartedPulling="2026-02-17 14:22:50.1375849 +0000 UTC m=+3444.249004237" lastFinishedPulling="2026-02-17 14:22:52.45596221 +0000 UTC m=+3446.567381547" observedRunningTime="2026-02-17 14:22:52.71747183 +0000 UTC m=+3446.828891187" watchObservedRunningTime="2026-02-17 14:22:52.719468024 +0000 UTC m=+3446.830887361" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.757386 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:52 crc kubenswrapper[4804]: I0217 14:22:52.810167 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:54 crc kubenswrapper[4804]: I0217 14:22:54.722575 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8chgg" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" containerID="cri-o://5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" gracePeriod=2 Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.156468 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271186 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271333 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.271422 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") pod \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\" (UID: \"dde1b880-fcbe-493d-85e0-44763ee6e1f8\") " Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.272247 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities" (OuterVolumeSpecName: "utilities") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.276385 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q" (OuterVolumeSpecName: "kube-api-access-skv7q") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "kube-api-access-skv7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.295568 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dde1b880-fcbe-493d-85e0-44763ee6e1f8" (UID: "dde1b880-fcbe-493d-85e0-44763ee6e1f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374086 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374126 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skv7q\" (UniqueName: \"kubernetes.io/projected/dde1b880-fcbe-493d-85e0-44763ee6e1f8-kube-api-access-skv7q\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.374136 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde1b880-fcbe-493d-85e0-44763ee6e1f8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734323 4804 generic.go:334] "Generic (PLEG): container finished" podID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" exitCode=0 Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"} Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734747 4804 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8chgg" event={"ID":"dde1b880-fcbe-493d-85e0-44763ee6e1f8","Type":"ContainerDied","Data":"c32ab401f3ef9117e51efe6faab4692742c641c8ab361c05e3938b5036ba0972"} Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734789 4804 scope.go:117] "RemoveContainer" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.734419 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8chgg" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.759663 4804 scope.go:117] "RemoveContainer" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.781856 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.790706 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8chgg"] Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.803149 4804 scope.go:117] "RemoveContainer" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.830711 4804 scope.go:117] "RemoveContainer" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 14:22:55.831092 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": container with ID starting with 5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe not found: ID does not exist" containerID="5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831131 4804 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe"} err="failed to get container status \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": rpc error: code = NotFound desc = could not find container \"5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe\": container with ID starting with 5543bfe53658d0bdd76001eb8de9e492dea3f4355d25e5ed27d4c5f8842830fe not found: ID does not exist" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831159 4804 scope.go:117] "RemoveContainer" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 14:22:55.831480 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": container with ID starting with 4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74 not found: ID does not exist" containerID="4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831559 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74"} err="failed to get container status \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": rpc error: code = NotFound desc = could not find container \"4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74\": container with ID starting with 4cd950d0a4c6def43bb769f210172ec9f68a9326d2f1048069e9534f8fc95f74 not found: ID does not exist" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831590 4804 scope.go:117] "RemoveContainer" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: E0217 
14:22:55.831835 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": container with ID starting with 4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348 not found: ID does not exist" containerID="4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348" Feb 17 14:22:55 crc kubenswrapper[4804]: I0217 14:22:55.831856 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348"} err="failed to get container status \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": rpc error: code = NotFound desc = could not find container \"4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348\": container with ID starting with 4cdc2945c974221ef0ceae74cdf7d55e0de10f77f1b218b174597c608971b348 not found: ID does not exist" Feb 17 14:22:56 crc kubenswrapper[4804]: I0217 14:22:56.586486 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" path="/var/lib/kubelet/pods/dde1b880-fcbe-493d-85e0-44763ee6e1f8/volumes" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.938387 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939426 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939444 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939462 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" 
containerName="extract-utilities" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939471 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-utilities" Feb 17 14:23:13 crc kubenswrapper[4804]: E0217 14:23:13.939513 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-content" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939521 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="extract-content" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.939729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde1b880-fcbe-493d-85e0-44763ee6e1f8" containerName="registry-server" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.944245 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.946598 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4k4qm"/"default-dockercfg-p49xk" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.947066 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4qm"/"openshift-service-ca.crt" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.951944 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4k4qm"/"kube-root-ca.crt" Feb 17 14:23:13 crc kubenswrapper[4804]: I0217 14:23:13.951969 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.060581 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: 
\"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.063318 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165403 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165481 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.165994 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.183338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhtq\" (UniqueName: 
\"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"must-gather-49hd6\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.266128 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.756807 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:23:14 crc kubenswrapper[4804]: I0217 14:23:14.926429 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"47b1ad9526c381d40fe9be04bdae4d60f49c32ce0c24d8723fd0ea8eb1b02180"} Feb 17 14:23:21 crc kubenswrapper[4804]: I0217 14:23:21.020382 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} Feb 17 14:23:22 crc kubenswrapper[4804]: I0217 14:23:22.033967 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerStarted","Data":"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e"} Feb 17 14:23:22 crc kubenswrapper[4804]: I0217 14:23:22.060085 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4qm/must-gather-49hd6" podStartSLOduration=3.225402457 podStartE2EDuration="9.060070771s" podCreationTimestamp="2026-02-17 14:23:13 +0000 UTC" firstStartedPulling="2026-02-17 14:23:14.774407057 +0000 UTC m=+3468.885826394" lastFinishedPulling="2026-02-17 14:23:20.609075371 +0000 UTC 
m=+3474.720494708" observedRunningTime="2026-02-17 14:23:22.053248367 +0000 UTC m=+3476.164667714" watchObservedRunningTime="2026-02-17 14:23:22.060070771 +0000 UTC m=+3476.171490108" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.591050 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.593022 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.778224 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.778579 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880105 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880242 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " 
pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.880298 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:24 crc kubenswrapper[4804]: I0217 14:23:24.912100 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"crc-debug-pdlxg\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:25 crc kubenswrapper[4804]: I0217 14:23:25.209409 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:23:26 crc kubenswrapper[4804]: I0217 14:23:26.091321 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerStarted","Data":"9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f"} Feb 17 14:23:37 crc kubenswrapper[4804]: I0217 14:23:37.196179 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerStarted","Data":"937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418"} Feb 17 14:23:37 crc kubenswrapper[4804]: I0217 14:23:37.216417 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" podStartSLOduration=2.221199435 podStartE2EDuration="13.216400325s" podCreationTimestamp="2026-02-17 14:23:24 +0000 UTC" firstStartedPulling="2026-02-17 
14:23:25.245342625 +0000 UTC m=+3479.356761962" lastFinishedPulling="2026-02-17 14:23:36.240543515 +0000 UTC m=+3490.351962852" observedRunningTime="2026-02-17 14:23:37.208714634 +0000 UTC m=+3491.320133971" watchObservedRunningTime="2026-02-17 14:23:37.216400325 +0000 UTC m=+3491.327819662" Feb 17 14:24:15 crc kubenswrapper[4804]: I0217 14:24:15.543049 4804 generic.go:334] "Generic (PLEG): container finished" podID="a166cb3c-985b-42bb-943e-5135d68d5827" containerID="937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418" exitCode=0 Feb 17 14:24:15 crc kubenswrapper[4804]: I0217 14:24:15.543124 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" event={"ID":"a166cb3c-985b-42bb-943e-5135d68d5827","Type":"ContainerDied","Data":"937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418"} Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.666184 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.699524 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.708118 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-pdlxg"] Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715593 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") pod \"a166cb3c-985b-42bb-943e-5135d68d5827\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715668 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") pod \"a166cb3c-985b-42bb-943e-5135d68d5827\" (UID: \"a166cb3c-985b-42bb-943e-5135d68d5827\") " Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.715757 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host" (OuterVolumeSpecName: "host") pod "a166cb3c-985b-42bb-943e-5135d68d5827" (UID: "a166cb3c-985b-42bb-943e-5135d68d5827"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.716136 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a166cb3c-985b-42bb-943e-5135d68d5827-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.739413 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz" (OuterVolumeSpecName: "kube-api-access-qp2dz") pod "a166cb3c-985b-42bb-943e-5135d68d5827" (UID: "a166cb3c-985b-42bb-943e-5135d68d5827"). InnerVolumeSpecName "kube-api-access-qp2dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:16 crc kubenswrapper[4804]: I0217 14:24:16.818247 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2dz\" (UniqueName: \"kubernetes.io/projected/a166cb3c-985b-42bb-943e-5135d68d5827-kube-api-access-qp2dz\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.562261 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.562313 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-pdlxg" Feb 17 14:24:17 crc kubenswrapper[4804]: E0217 14:24:17.646859 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda166cb3c_985b_42bb_943e_5135d68d5827.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda166cb3c_985b_42bb_943e_5135d68d5827.slice/crio-9635daeefc377d00a326a2faedd3da9b6967b384943c47b23979e8593277e41f\": RecentStats: unable to find data in memory cache]" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889388 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:17 crc kubenswrapper[4804]: E0217 14:24:17.889764 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889777 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.889973 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" containerName="container-00" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.890612 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.938789 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:17 crc kubenswrapper[4804]: I0217 14:24:17.939338 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041500 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041812 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.041893 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc 
kubenswrapper[4804]: I0217 14:24:18.071338 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"crc-debug-75j4k\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.209298 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.576379 4804 generic.go:334] "Generic (PLEG): container finished" podID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerID="c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6" exitCode=0 Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.591716 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a166cb3c-985b-42bb-943e-5135d68d5827" path="/var/lib/kubelet/pods/a166cb3c-985b-42bb-943e-5135d68d5827/volumes" Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.592248 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" event={"ID":"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1","Type":"ContainerDied","Data":"c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6"} Feb 17 14:24:18 crc kubenswrapper[4804]: I0217 14:24:18.592288 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" event={"ID":"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1","Type":"ContainerStarted","Data":"92ee573375d647aa71e22c38c26fe16fd05f0f9bc3b131e96b1189ce81afbe11"} Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.023123 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.031351 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-4k4qm/crc-debug-75j4k"] Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.679639 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782065 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") pod \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782169 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") pod \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\" (UID: \"812cc376-c0d4-45d6-9eb0-3500f3bb0ac1\") " Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782389 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host" (OuterVolumeSpecName: "host") pod "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" (UID: "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.782943 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.789360 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb" (OuterVolumeSpecName: "kube-api-access-qkjfb") pod "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" (UID: "812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"). 
InnerVolumeSpecName "kube-api-access-qkjfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:19 crc kubenswrapper[4804]: I0217 14:24:19.884995 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkjfb\" (UniqueName: \"kubernetes.io/projected/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1-kube-api-access-qkjfb\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.184632 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:20 crc kubenswrapper[4804]: E0217 14:24:20.184986 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.184998 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.185175 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" containerName="container-00" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.185702 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.292871 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.293505 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396055 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396157 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.396361 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc 
kubenswrapper[4804]: I0217 14:24:20.412665 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"crc-debug-5h27q\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.502359 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:20 crc kubenswrapper[4804]: W0217 14:24:20.530077 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef953a43_c0ed_40e6_9cdc_9fe7596564d5.slice/crio-b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324 WatchSource:0}: Error finding container b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324: Status 404 returned error can't find the container with id b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324 Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.598312 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812cc376-c0d4-45d6-9eb0-3500f3bb0ac1" path="/var/lib/kubelet/pods/812cc376-c0d4-45d6-9eb0-3500f3bb0ac1/volumes" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.602398 4804 scope.go:117] "RemoveContainer" containerID="c9976e937dce8ae35118888d70c9c2b90975535717d1cc3679c3c08b380920c6" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.602550 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-75j4k" Feb 17 14:24:20 crc kubenswrapper[4804]: I0217 14:24:20.610479 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" event={"ID":"ef953a43-c0ed-40e6-9cdc-9fe7596564d5","Type":"ContainerStarted","Data":"b76ccb7c3019aa9ed072f4cec43e774660c7cfa815cf23720109110b270b4324"} Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.622316 4804 generic.go:334] "Generic (PLEG): container finished" podID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerID="a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7" exitCode=0 Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.622427 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" event={"ID":"ef953a43-c0ed-40e6-9cdc-9fe7596564d5","Type":"ContainerDied","Data":"a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7"} Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.664997 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:21 crc kubenswrapper[4804]: I0217 14:24:21.673244 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/crc-debug-5h27q"] Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.746589 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850593 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") pod \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850699 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") pod \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\" (UID: \"ef953a43-c0ed-40e6-9cdc-9fe7596564d5\") " Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.850732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host" (OuterVolumeSpecName: "host") pod "ef953a43-c0ed-40e6-9cdc-9fe7596564d5" (UID: "ef953a43-c0ed-40e6-9cdc-9fe7596564d5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.851056 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.863948 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d" (OuterVolumeSpecName: "kube-api-access-j5r9d") pod "ef953a43-c0ed-40e6-9cdc-9fe7596564d5" (UID: "ef953a43-c0ed-40e6-9cdc-9fe7596564d5"). InnerVolumeSpecName "kube-api-access-j5r9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:22 crc kubenswrapper[4804]: I0217 14:24:22.952961 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5r9d\" (UniqueName: \"kubernetes.io/projected/ef953a43-c0ed-40e6-9cdc-9fe7596564d5-kube-api-access-j5r9d\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:23 crc kubenswrapper[4804]: I0217 14:24:23.642223 4804 scope.go:117] "RemoveContainer" containerID="a5e0d360afd08e835a1b932952aecfeb3e5f4e1f58b1f3c9f05af31078c78de7" Feb 17 14:24:23 crc kubenswrapper[4804]: I0217 14:24:23.642277 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/crc-debug-5h27q" Feb 17 14:24:24 crc kubenswrapper[4804]: I0217 14:24:24.586250 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" path="/var/lib/kubelet/pods/ef953a43-c0ed-40e6-9cdc-9fe7596564d5/volumes" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.744816 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:36 crc kubenswrapper[4804]: E0217 14:24:36.745730 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.745743 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.745936 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef953a43-c0ed-40e6-9cdc-9fe7596564d5" containerName="container-00" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.747166 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.761796 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.816835 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.816991 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.817043 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918286 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918413 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918443 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918883 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.918894 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:36 crc kubenswrapper[4804]: I0217 14:24:36.946921 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"community-operators-fbh6b\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.063841 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.621042 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:37 crc kubenswrapper[4804]: I0217 14:24:37.776836 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"e476be034a124bd37543c8e10bdc0da87c6541a84b1b15edb997e8e42215c690"} Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.260622 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.415293 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api-log/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.456095 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.517833 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener-log/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.706491 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.728156 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker-log/0.log" Feb 
17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.787338 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" exitCode=0 Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.787383 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493"} Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.884495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p_9ee075c2-2363-4446-8545-dfdece6ca4da/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:38 crc kubenswrapper[4804]: I0217 14:24:38.988612 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-central-agent/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.033750 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-notification-agent/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.101966 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/proxy-httpd/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.132112 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/sg-core/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.301849 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.318406 4804 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api-log/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.459299 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/cinder-scheduler/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.603148 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-499xq_5c4e88aa-842f-453a-9ce9-8354c16340e9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.616855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/probe/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.798523 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.832516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:24:39 crc kubenswrapper[4804]: I0217 14:24:39.892060 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq_5ca70007-e938-4bd5-9f2a-66f18b87743a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.071046 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/dnsmasq-dns/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.104261 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.116482 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc_5ecc3e55-21c0-4017-8dce-9c77fd2189ea/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.304947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-log/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.500010 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-log/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.622600 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-httpd/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.625636 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-httpd/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.732014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon/0.log" Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.807604 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" exitCode=0 Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.807649 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" 
event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} Feb 17 14:24:40 crc kubenswrapper[4804]: I0217 14:24:40.818922 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65nc8_0a55b597-4920-4fa6-99d5-6deaa6f30a4a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.019114 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon-log/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.090873 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hx4nm_e9b53a85-8a87-4b65-8832-00c4175da541/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.369849 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9cc757857-wng6k_30df70d3-9323-4ddd-9d1c-2dae72cff6d9/keystone-api/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.404455 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522281-k9ptv_c2d1f319-5d08-4969-a968-45eba20958a7/keystone-cron/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.543349 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d6aabf20-b0bf-4f35-aec7-098f38bacfd9/kube-state-metrics/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.685772 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc_c0aad2ba-98cf-42b5-9c03-40633fb8ac18/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.817847 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerStarted","Data":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} Feb 17 14:24:41 crc kubenswrapper[4804]: I0217 14:24:41.844235 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fbh6b" podStartSLOduration=3.427210233 podStartE2EDuration="5.844211475s" podCreationTimestamp="2026-02-17 14:24:36 +0000 UTC" firstStartedPulling="2026-02-17 14:24:38.789549457 +0000 UTC m=+3552.900968794" lastFinishedPulling="2026-02-17 14:24:41.206550699 +0000 UTC m=+3555.317970036" observedRunningTime="2026-02-17 14:24:41.841574282 +0000 UTC m=+3555.952993619" watchObservedRunningTime="2026-02-17 14:24:41.844211475 +0000 UTC m=+3555.955630822" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.051580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-api/0.log" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.162781 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-httpd/0.log" Feb 17 14:24:42 crc kubenswrapper[4804]: I0217 14:24:42.331715 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg_84938cd5-694c-423a-a0d1-801f28085377/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.156122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fc78e86d-494e-417b-8569-b564cdbd069a/nova-cell0-conductor-conductor/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.162045 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-log/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.319039 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-api/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.467069 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a13dbc73-75fc-448b-af44-cb7018d1640e/nova-cell1-conductor-conductor/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.564696 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5c380610-c164-4798-a5df-9b90fd475667/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:24:43 crc kubenswrapper[4804]: I0217 14:24:43.954907 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-x8lml_9f17dd92-0402-40c7-bdc7-50b38e37f750/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.021516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-log/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.382664 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1bac289d-58a7-4e23-8805-c48811d12d32/nova-scheduler-scheduler/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.395982 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.589843 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.644725 4804 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/galera/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.799075 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:24:44 crc kubenswrapper[4804]: I0217 14:24:44.985605 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.032794 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/galera/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.048568 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-metadata/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.194563 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_de1a53e3-68ce-4ecd-9c0a-80ffce568891/openstackclient/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.284014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4s7l5_d286aa08-b0df-44e8-9128-f596f4b44db8/openstack-network-exporter/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.417820 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.654324 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovs-vswitchd/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.709692 4804 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.757378 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.920547 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rzcfd_9c049787-03d2-4679-8705-ec2cd1ad8141/ovn-controller/0.log" Feb 17 14:24:45 crc kubenswrapper[4804]: I0217 14:24:45.932910 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v478m_be98213b-0510-4f69-9d98-81363c04d8bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.093214 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.167413 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/ovn-northd/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.239019 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.292769 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/ovsdbserver-nb/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.454490 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/openstack-network-exporter/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.467115 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/ovsdbserver-sb/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.696621 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-api/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.798974 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.800060 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-log/0.log" Feb 17 14:24:46 crc kubenswrapper[4804]: I0217 14:24:46.965211 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.012931 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.045237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/rabbitmq/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.065261 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.065448 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.120767 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.252401 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.262083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66_100d84c5-396c-4772-af09-2e223e72a640/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.270682 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/rabbitmq/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.473435 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z6s9f_c87b0376-c505-452b-90ed-0e6bb7e6e8e0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.525908 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zctst_ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.690626 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf97c_01fe0e44-6604-4e17-bcb4-05f202508fc7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.729854 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jrnh_cdb9b3eb-f3d1-4a32-8a87-b0f686cad260/ssh-known-hosts-edpm-deployment/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.921530 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.970085 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-server/0.log" Feb 17 14:24:47 crc kubenswrapper[4804]: I0217 14:24:47.980522 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.046800 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-httpd/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.194424 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mv8w5_41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2/swift-ring-rebalance/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.228302 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-reaper/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.278027 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-auditor/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.435647 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.448539 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-server/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.509947 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-auditor/0.log" Feb 17 14:24:48 crc 
kubenswrapper[4804]: I0217 14:24:48.543169 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.635452 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-updater/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.656926 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-server/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.779572 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-auditor/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.825176 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-expirer/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.900337 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-replicator/0.log" Feb 17 14:24:48 crc kubenswrapper[4804]: I0217 14:24:48.981522 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-server/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.028836 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-updater/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.032439 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/rsync/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.146458 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/swift-recon-cron/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.304848 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wtq55_0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.386008 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7b246dc-1d07-4725-b471-88fe82584d24/tempest-tests-tempest-tests-runner/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.527669 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4c6dcbcb-8248-40b5-8fd6-7824c487109e/test-operator-logs-container/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.666855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb_ed6642bc-b49f-4e17-a721-b3eae09246aa/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:24:49 crc kubenswrapper[4804]: I0217 14:24:49.881574 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fbh6b" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" containerID="cri-o://f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" gracePeriod=2 Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.408388 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566110 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566307 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.566365 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") pod \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\" (UID: \"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9\") " Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.567394 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities" (OuterVolumeSpecName: "utilities") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.573561 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs" (OuterVolumeSpecName: "kube-api-access-gwxbs") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "kube-api-access-gwxbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.619973 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" (UID: "f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668483 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668529 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwxbs\" (UniqueName: \"kubernetes.io/projected/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-kube-api-access-gwxbs\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.668545 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.695162 4804 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod812cc376-c0d4-45d6-9eb0-3500f3bb0ac1"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod812cc376-c0d4-45d6-9eb0-3500f3bb0ac1] : Timed out while waiting for systemd to remove kubepods-besteffort-pod812cc376_c0d4_45d6_9eb0_3500f3bb0ac1.slice" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893438 4804 generic.go:334] "Generic (PLEG): container finished" podID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" 
exitCode=0 Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893477 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fbh6b" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893492 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893526 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbh6b" event={"ID":"f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9","Type":"ContainerDied","Data":"e476be034a124bd37543c8e10bdc0da87c6541a84b1b15edb997e8e42215c690"} Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.893556 4804 scope.go:117] "RemoveContainer" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.935336 4804 scope.go:117] "RemoveContainer" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.966759 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:50 crc kubenswrapper[4804]: I0217 14:24:50.979448 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fbh6b"] Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.025439 4804 scope.go:117] "RemoveContainer" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.048395 4804 scope.go:117] "RemoveContainer" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.051407 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": container with ID starting with f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431 not found: ID does not exist" containerID="f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.051463 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431"} err="failed to get container status \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": rpc error: code = NotFound desc = could not find container \"f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431\": container with ID starting with f6de235db8c65c1e1f660597bf3cdda3eea0fa509f4d751818543e3e71cd9431 not found: ID does not exist" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.051512 4804 scope.go:117] "RemoveContainer" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.052718 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": container with ID starting with da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df not found: ID does not exist" containerID="da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.052757 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df"} err="failed to get container status \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": rpc error: code = NotFound desc = could 
not find container \"da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df\": container with ID starting with da88800c0e4a215feb5506ae2c2cc2a511a3c63fb8b3fde64213461f864029df not found: ID does not exist" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.052785 4804 scope.go:117] "RemoveContainer" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: E0217 14:24:51.053604 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": container with ID starting with 0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493 not found: ID does not exist" containerID="0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493" Feb 17 14:24:51 crc kubenswrapper[4804]: I0217 14:24:51.053663 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493"} err="failed to get container status \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": rpc error: code = NotFound desc = could not find container \"0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493\": container with ID starting with 0ee7a09a9548612157cd0499cb44d44f1bc48d1f96bb83d6a4a27122857dc493 not found: ID does not exist" Feb 17 14:24:52 crc kubenswrapper[4804]: I0217 14:24:52.585524 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" path="/var/lib/kubelet/pods/f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9/volumes" Feb 17 14:24:55 crc kubenswrapper[4804]: I0217 14:24:55.835260 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:24:55 crc kubenswrapper[4804]: I0217 14:24:55.835587 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:24:58 crc kubenswrapper[4804]: I0217 14:24:58.601283 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5ef96d0-19a6-4561-bde2-cf38e0280b39/memcached/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.048666 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.251022 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.253804 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.273854 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.448598 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: 
I0217 14:25:14.452722 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/extract/0.log" Feb 17 14:25:14 crc kubenswrapper[4804]: I0217 14:25:14.487896 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.072774 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-bslfv_fbc5e6cd-47c6-4199-a0f2-e4292a836fac/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.382684 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-vt6zw_5796dc62-bd84-48b7-9c4c-7d5bf1f7e984/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.577281 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-sxtr2_5727ae12-4720-4470-b5cc-8b8ae81c2af7/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.818618 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-t6hlr_5fa66dc5-a518-40dd-a4b5-dd2b34425ad5/manager/0.log" Feb 17 14:25:15 crc kubenswrapper[4804]: I0217 14:25:15.996844 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-wn64m_0b746a42-c0b4-4cb9-9352-3623669bad5a/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.186152 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-cdpkr_07b97973-fa08-4b79-9164-918a4d04f8b7/manager/0.log" Feb 17 
14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.358474 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-lrjgg_bf13099a-fbab-41bf-b30c-5c6b1049af19/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.437083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-pddsh_430279ab-ba2f-4838-ab07-b851d4df84a0/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.552374 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-88sh4_d3332002-6930-418f-8288-e8344be70c6a/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.686143 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-vkdg2_2546387a-6a42-4f8d-a321-2f9cbaa11adb/manager/0.log" Feb 17 14:25:16 crc kubenswrapper[4804]: I0217 14:25:16.953501 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-l5cl2_97925efc-eb46-4a60-b372-b31f13a2c876/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.101273 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-c8hmm_36b1ca46-becb-417e-b05e-777d40246cb6/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.417592 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_ae7598b8-fff5-4044-bbd7-0c8f2f60eed8/manager/0.log" Feb 17 14:25:17 crc kubenswrapper[4804]: I0217 14:25:17.780247 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7cb8c4979f-kfx9x_f69fc148-3a8b-4065-b075-85ecad8339e7/operator/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.205548 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-55nc6_13d9e436-3cb0-4df0-aaf9-e614eba74c89/registry-server/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.570737 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-ltwrc_ac1e20c8-4527-4bba-85bd-2154e1244d3e/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.651973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-ptrs5_79eb8fb0-6207-44c8-b3c2-a00116bcf10b/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.779831 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-9vbg5_42505b9c-f878-4feb-b9a1-9dfa11ec0f56/manager/0.log" Feb 17 14:25:18 crc kubenswrapper[4804]: I0217 14:25:18.847655 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtlpm_44ec973d-9403-48f4-b92c-72f0bd485b0f/operator/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.061224 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-n6fl9_f94e791f-16fd-4364-a246-35bcca0d14e6/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.296518 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-nwmk5_1c7ad838-6225-4001-899a-7f741cb75f2f/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.325394 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-rbrxl_067b67c8-64c5-4c21-b1b1-770aa68e0eb7/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.494836 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-xlwmb_57038414-fcca-4a2a-8756-46f97cc57d81/manager/0.log" Feb 17 14:25:19 crc kubenswrapper[4804]: I0217 14:25:19.719489 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5744df64c-mkkrv_8155784a-3945-4ca3-aa9a-b0e089ffac52/manager/0.log" Feb 17 14:25:21 crc kubenswrapper[4804]: I0217 14:25:21.519062 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-4xvfg_545c7d25-7774-4c62-89b8-f491fd4065e8/manager/0.log" Feb 17 14:25:25 crc kubenswrapper[4804]: I0217 14:25:25.835328 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:25:25 crc kubenswrapper[4804]: I0217 14:25:25.835942 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:25:39 crc kubenswrapper[4804]: I0217 14:25:39.814429 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4m4g_6c98dfab-f166-4eb4-b385-724d6b9b9d7a/control-plane-machine-set-operator/0.log" Feb 17 14:25:40 crc kubenswrapper[4804]: I0217 14:25:40.076831 4804 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/kube-rbac-proxy/0.log" Feb 17 14:25:40 crc kubenswrapper[4804]: I0217 14:25:40.088031 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/machine-api-operator/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.041105 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7sfkb_112c357f-f1dc-4a07-bba0-ddf54ab071ff/cert-manager-controller/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.244580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kbdz5_9d2d8008-6348-4f24-8085-d30db8558ab3/cert-manager-cainjector/0.log" Feb 17 14:25:53 crc kubenswrapper[4804]: I0217 14:25:53.311572 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c8nh8_be70f757-4537-489d-a86e-a1b49fc9af75/cert-manager-webhook/0.log" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.834923 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.835272 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.835322 4804 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.836126 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:25:55 crc kubenswrapper[4804]: I0217 14:25:55.836190 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" gracePeriod=600 Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475172 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" exitCode=0 Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475232 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a"} Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475580 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} Feb 17 14:25:56 crc kubenswrapper[4804]: I0217 14:25:56.475608 4804 scope.go:117] "RemoveContainer" 
containerID="cdbc2cae81b3e82d1fce36b183650f6618c42148ac6fbf7d376afb0576a16cce" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.978468 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979362 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-content" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979377 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-content" Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979394 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-utilities" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979403 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="extract-utilities" Feb 17 14:26:02 crc kubenswrapper[4804]: E0217 14:26:02.979428 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979437 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.979690 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66ad65b-7a2a-47a8-a4ed-512f4d6ecee9" containerName="registry-server" Feb 17 14:26:02 crc kubenswrapper[4804]: I0217 14:26:02.981271 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.006066 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.038954 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.039064 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.039116 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141086 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141213 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.141280 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.142158 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.142485 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.171335 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"certified-operators-k4q2q\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.304435 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:03 crc kubenswrapper[4804]: I0217 14:26:03.874179 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.545967 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" exitCode=0 Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.546020 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7"} Feb 17 14:26:04 crc kubenswrapper[4804]: I0217 14:26:04.546309 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"588e0c01fb6efe70994ca955b961363bfa392c6732aced2b41d8b00a42135f3e"} Feb 17 14:26:05 crc kubenswrapper[4804]: I0217 14:26:05.558237 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569337 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" exitCode=0 Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569406 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" 
event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.569902 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerStarted","Data":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.597741 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4q2q" podStartSLOduration=3.15384256 podStartE2EDuration="4.597717558s" podCreationTimestamp="2026-02-17 14:26:02 +0000 UTC" firstStartedPulling="2026-02-17 14:26:04.549356045 +0000 UTC m=+3638.660775392" lastFinishedPulling="2026-02-17 14:26:05.993231053 +0000 UTC m=+3640.104650390" observedRunningTime="2026-02-17 14:26:06.593470854 +0000 UTC m=+3640.704890201" watchObservedRunningTime="2026-02-17 14:26:06.597717558 +0000 UTC m=+3640.709136895" Feb 17 14:26:06 crc kubenswrapper[4804]: I0217 14:26:06.940370 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bgf7w_2158c202-5aa4-47aa-87a1-73e4b9043e78/nmstate-console-plugin/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.194237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/kube-rbac-proxy/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.241152 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jxn7r_81e46a71-360c-4509-ad38-2b2c814a56c2/nmstate-handler/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.270894 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/nmstate-metrics/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.399265 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rkf7s_2789dcb9-5619-4986-a692-1eec733c97ff/nmstate-operator/0.log" Feb 17 14:26:07 crc kubenswrapper[4804]: I0217 14:26:07.511942 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-dbfqz_36fd4ae3-048e-4e51-b2fa-875a5c84b8e0/nmstate-webhook/0.log" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.305411 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.305931 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.363305 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.671895 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:13 crc kubenswrapper[4804]: I0217 14:26:13.714960 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:15 crc kubenswrapper[4804]: I0217 14:26:15.641870 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4q2q" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" containerID="cri-o://e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" gracePeriod=2 Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.149412 4804 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303466 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303551 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.303733 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") pod \"6f377127-aca7-4b36-976b-fdc21aadd31b\" (UID: \"6f377127-aca7-4b36-976b-fdc21aadd31b\") " Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.304889 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities" (OuterVolumeSpecName: "utilities") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.309578 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh" (OuterVolumeSpecName: "kube-api-access-5x2mh") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). 
InnerVolumeSpecName "kube-api-access-5x2mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.363732 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f377127-aca7-4b36-976b-fdc21aadd31b" (UID: "6f377127-aca7-4b36-976b-fdc21aadd31b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406243 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406288 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f377127-aca7-4b36-976b-fdc21aadd31b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.406301 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x2mh\" (UniqueName: \"kubernetes.io/projected/6f377127-aca7-4b36-976b-fdc21aadd31b-kube-api-access-5x2mh\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652170 4804 generic.go:334] "Generic (PLEG): container finished" podID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" exitCode=0 Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652239 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4q2q" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.652249 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.653372 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4q2q" event={"ID":"6f377127-aca7-4b36-976b-fdc21aadd31b","Type":"ContainerDied","Data":"588e0c01fb6efe70994ca955b961363bfa392c6732aced2b41d8b00a42135f3e"} Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.653390 4804 scope.go:117] "RemoveContainer" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.680300 4804 scope.go:117] "RemoveContainer" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.684237 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.695242 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4q2q"] Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.707909 4804 scope.go:117] "RemoveContainer" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.762603 4804 scope.go:117] "RemoveContainer" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763101 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": container with ID starting with e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf not found: ID does not exist" containerID="e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763168 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf"} err="failed to get container status \"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": rpc error: code = NotFound desc = could not find container \"e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf\": container with ID starting with e26f4086f54f34860838877f49284539398d479ec4b790a9847d73954aa953bf not found: ID does not exist" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763227 4804 scope.go:117] "RemoveContainer" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763556 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": container with ID starting with 5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c not found: ID does not exist" containerID="5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763596 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c"} err="failed to get container status \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": rpc error: code = NotFound desc = could not find container \"5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c\": container with ID 
starting with 5a721da4dcd3ad2de4531e775aa38bba4884a187c6ee9269362fa2fff56bfc7c not found: ID does not exist" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763623 4804 scope.go:117] "RemoveContainer" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: E0217 14:26:16.763913 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": container with ID starting with 859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7 not found: ID does not exist" containerID="859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7" Feb 17 14:26:16 crc kubenswrapper[4804]: I0217 14:26:16.763948 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7"} err="failed to get container status \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": rpc error: code = NotFound desc = could not find container \"859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7\": container with ID starting with 859966cd4b303ac29a91909a91abfa5313c62f12c4f835189bddd275801bedc7 not found: ID does not exist" Feb 17 14:26:18 crc kubenswrapper[4804]: I0217 14:26:18.585260 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" path="/var/lib/kubelet/pods/6f377127-aca7-4b36-976b-fdc21aadd31b/volumes" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.000546 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/kube-rbac-proxy/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.160469 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/controller/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.281594 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.472663 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.475501 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.476855 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.509131 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.688861 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.692470 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.703122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.784338 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.913956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.949318 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:26:33 crc kubenswrapper[4804]: I0217 14:26:33.959961 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.008016 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/controller/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.172671 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr-metrics/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.189541 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.214087 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy-frr/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.416802 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/reloader/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.420168 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-gl8tp_0d003d1c-2370-4291-a035-0ebe8b97cfee/frr-k8s-webhook-server/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.652870 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7c468df9-kbjlb_c17333d4-cfc6-4129-af9e-a8f2db54988b/manager/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.846100 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/kube-rbac-proxy/0.log" Feb 17 14:26:34 crc kubenswrapper[4804]: I0217 14:26:34.894585 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-996ff79d9-vm8dt_82716046-7f15-43d7-b9de-8fdb68a44c0b/webhook-server/0.log" Feb 17 14:26:35 crc kubenswrapper[4804]: I0217 14:26:35.566489 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/speaker/0.log" Feb 17 14:26:35 crc kubenswrapper[4804]: I0217 14:26:35.634075 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.349856 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.791787 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.823007 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.824353 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.949559 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:26:48 crc kubenswrapper[4804]: I0217 14:26:48.997113 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.042803 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/extract/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.123030 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.338246 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.340103 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.381600 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.520456 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.566726 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:26:49 crc kubenswrapper[4804]: I0217 14:26:49.793119 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.047776 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/registry-server/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.055799 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.071590 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.085553 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.234425 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.261579 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.508519 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.658468 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.697418 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.716473 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:50 crc kubenswrapper[4804]: I0217 14:26:50.946231 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.020874 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/extract/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 
14:26:51.022108 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.063003 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/registry-server/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.292921 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.315956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26cwx_78a56ea9-6641-4d2d-8471-b40e5f2cf7e5/marketplace-operator/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.472328 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.484069 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.559177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.737094 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.786745 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.913230 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/registry-server/0.log" Feb 17 14:26:51 crc kubenswrapper[4804]: I0217 14:26:51.960085 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.148422 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.149451 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.198153 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.368749 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-utilities/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.421929 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/extract-content/0.log" Feb 17 14:26:52 crc kubenswrapper[4804]: I0217 14:26:52.856121 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhcxz_fdf90149-055d-48ca-9336-ca6d6545f8a3/registry-server/0.log" Feb 17 
14:28:25 crc kubenswrapper[4804]: I0217 14:28:25.835044 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:28:25 crc kubenswrapper[4804]: I0217 14:28:25.835663 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.362085 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" exitCode=0 Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.362218 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4k4qm/must-gather-49hd6" event={"ID":"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a","Type":"ContainerDied","Data":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.363574 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:41 crc kubenswrapper[4804]: I0217 14:28:41.917860 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/gather/0.log" Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.424949 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.425738 4804 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-must-gather-4k4qm/must-gather-49hd6" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" containerID="cri-o://7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" gracePeriod=2 Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.436310 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4k4qm/must-gather-49hd6"] Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.830739 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/copy/0.log" Feb 17 14:28:49 crc kubenswrapper[4804]: I0217 14:28:49.831912 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.040021 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") pod \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.042893 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") pod \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\" (UID: \"dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a\") " Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.056703 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq" (OuterVolumeSpecName: "kube-api-access-zlhtq") pod "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" (UID: "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a"). InnerVolumeSpecName "kube-api-access-zlhtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.147511 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhtq\" (UniqueName: \"kubernetes.io/projected/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-kube-api-access-zlhtq\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.201054 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" (UID: "dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.248804 4804 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.454609 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4k4qm_must-gather-49hd6_dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/copy/0.log" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.454945 4804 generic.go:334] "Generic (PLEG): container finished" podID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" exitCode=143 Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.455003 4804 scope.go:117] "RemoveContainer" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.455183 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4k4qm/must-gather-49hd6" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.482717 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.530316 4804 scope.go:117] "RemoveContainer" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: E0217 14:28:50.534782 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": container with ID starting with 7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e not found: ID does not exist" containerID="7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535030 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e"} err="failed to get container status \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": rpc error: code = NotFound desc = could not find container \"7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e\": container with ID starting with 7785755ab9c2639f42f1f2d318c08fd3c381e3887af2585aac75256b147c0c8e not found: ID does not exist" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535117 4804 scope.go:117] "RemoveContainer" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: E0217 14:28:50.535507 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": container with ID starting with 
ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336 not found: ID does not exist" containerID="ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.535540 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336"} err="failed to get container status \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": rpc error: code = NotFound desc = could not find container \"ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336\": container with ID starting with ba2eb522415e52153f760776df650e5d6f8b40d4bd8471402148ec4a34e5b336 not found: ID does not exist" Feb 17 14:28:50 crc kubenswrapper[4804]: I0217 14:28:50.585983 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" path="/var/lib/kubelet/pods/dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a/volumes" Feb 17 14:28:55 crc kubenswrapper[4804]: I0217 14:28:55.835580 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:28:55 crc kubenswrapper[4804]: I0217 14:28:55.836153 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836309 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836885 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.836937 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.837934 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:29:25 crc kubenswrapper[4804]: I0217 14:29:25.837980 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" containerID="cri-o://f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" gracePeriod=600 Feb 17 14:29:25 crc kubenswrapper[4804]: E0217 14:29:25.971451 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.795803 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" exitCode=0 Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.795881 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db"} Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.796188 4804 scope.go:117] "RemoveContainer" containerID="7d4a85eb120e127ec0b1aabea32973bfd4724fea056face9a5b718c636d4f49a" Feb 17 14:29:26 crc kubenswrapper[4804]: I0217 14:29:26.796792 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:26 crc kubenswrapper[4804]: E0217 14:29:26.797068 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:36 crc kubenswrapper[4804]: I0217 14:29:36.359860 4804 scope.go:117] "RemoveContainer" containerID="937494af4bed6ea8a3c15ac225b0965ab5ea21328dc383082b45d9d125e0f418" Feb 17 14:29:40 crc kubenswrapper[4804]: I0217 14:29:40.574908 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:40 crc kubenswrapper[4804]: E0217 14:29:40.575801 4804 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:29:54 crc kubenswrapper[4804]: I0217 14:29:54.574607 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:29:54 crc kubenswrapper[4804]: E0217 14:29:54.575361 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.195505 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198076 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198117 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198144 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-content" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198153 4804 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-content" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198167 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-utilities" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198175 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="extract-utilities" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198217 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198227 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: E0217 14:30:00.198241 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198249 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198478 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="gather" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198502 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f377127-aca7-4b36-976b-fdc21aadd31b" containerName="registry-server" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.198514 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfddc88e-4c36-4f3d-a75d-db8ba2c3fe7a" containerName="copy" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.199311 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.202108 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.202932 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.206782 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277862 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277919 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.277964 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.381877 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382147 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382193 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.382979 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.396421 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.409452 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"collect-profiles-29522310-stvxt\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:00 crc kubenswrapper[4804]: I0217 14:30:00.520658 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.010866 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt"] Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.242929 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerStarted","Data":"d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3"} Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.243300 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerStarted","Data":"465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3"} Feb 17 14:30:01 crc kubenswrapper[4804]: I0217 14:30:01.260583 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" 
podStartSLOduration=1.260557652 podStartE2EDuration="1.260557652s" podCreationTimestamp="2026-02-17 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:01.257644801 +0000 UTC m=+3875.369064138" watchObservedRunningTime="2026-02-17 14:30:01.260557652 +0000 UTC m=+3875.371976989" Feb 17 14:30:02 crc kubenswrapper[4804]: I0217 14:30:02.253309 4804 generic.go:334] "Generic (PLEG): container finished" podID="7fb08c68-ef85-4035-b769-a0b54926b503" containerID="d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3" exitCode=0 Feb 17 14:30:02 crc kubenswrapper[4804]: I0217 14:30:02.253415 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerDied","Data":"d73341e9e6582eba29e761aaf883d659674c8bff051eb57c84cf289f2ed6dee3"} Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.579779 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.650900 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651174 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651316 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") pod \"7fb08c68-ef85-4035-b769-a0b54926b503\" (UID: \"7fb08c68-ef85-4035-b769-a0b54926b503\") " Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.651957 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.653736 4804 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fb08c68-ef85-4035-b769-a0b54926b503-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.657383 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.657403 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm" (OuterVolumeSpecName: "kube-api-access-fxblm") pod "7fb08c68-ef85-4035-b769-a0b54926b503" (UID: "7fb08c68-ef85-4035-b769-a0b54926b503"). InnerVolumeSpecName "kube-api-access-fxblm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.756249 4804 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fb08c68-ef85-4035-b769-a0b54926b503-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:03 crc kubenswrapper[4804]: I0217 14:30:03.756281 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxblm\" (UniqueName: \"kubernetes.io/projected/7fb08c68-ef85-4035-b769-a0b54926b503-kube-api-access-fxblm\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272821 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" event={"ID":"7fb08c68-ef85-4035-b769-a0b54926b503","Type":"ContainerDied","Data":"465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3"} Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272855 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="465247d0d6d9048459debc68ffe300603683b95e428f418da14de8d8d23c31b3" Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.272874 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-stvxt" Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.361782 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.370298 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522265-8m8rs"] Feb 17 14:30:04 crc kubenswrapper[4804]: I0217 14:30:04.586116 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f" path="/var/lib/kubelet/pods/c77ee5ee-2b38-4a70-bc28-e2cdf625ab1f/volumes" Feb 17 14:30:07 crc kubenswrapper[4804]: I0217 14:30:07.574907 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:30:07 crc kubenswrapper[4804]: E0217 14:30:07.576661 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:18 crc kubenswrapper[4804]: I0217 14:30:18.574583 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:30:18 crc kubenswrapper[4804]: E0217 14:30:18.575515 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:30 crc kubenswrapper[4804]: I0217 14:30:30.745396 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:30:30 crc kubenswrapper[4804]: E0217 14:30:30.746705 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:36 crc kubenswrapper[4804]: I0217 14:30:36.426872 4804 scope.go:117] "RemoveContainer" containerID="96261c5dff8beaf5a66244a0c5555316f61e48042e355a630a22cedfabc69568" Feb 17 14:30:45 crc kubenswrapper[4804]: I0217 14:30:45.574316 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:30:45 crc kubenswrapper[4804]: E0217 14:30:45.574966 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:30:59 crc kubenswrapper[4804]: I0217 14:30:59.574152 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:30:59 crc kubenswrapper[4804]: E0217 14:30:59.574908 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.750299 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"] Feb 17 14:31:06 crc kubenswrapper[4804]: E0217 14:31:06.751268 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.751286 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.751487 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb08c68-ef85-4035-b769-a0b54926b503" containerName="collect-profiles" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.753124 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.762232 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"] Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789660 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789740 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.789786 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891167 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891254 4804 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891333 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891848 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-utilities\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.891875 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa554a7-2c33-433d-89c1-403c44aa0215-catalog-content\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:06 crc kubenswrapper[4804]: I0217 14:31:06.912933 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsqw6\" (UniqueName: \"kubernetes.io/projected/3aa554a7-2c33-433d-89c1-403c44aa0215-kube-api-access-gsqw6\") pod \"redhat-operators-zn8mk\" (UID: \"3aa554a7-2c33-433d-89c1-403c44aa0215\") " pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:07 crc kubenswrapper[4804]: I0217 14:31:07.076378 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:07 crc kubenswrapper[4804]: I0217 14:31:07.625959 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"] Feb 17 14:31:08 crc kubenswrapper[4804]: E0217 14:31:08.018150 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aa554a7_2c33_433d_89c1_403c44aa0215.slice/crio-conmon-6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140847 4804 generic.go:334] "Generic (PLEG): container finished" podID="3aa554a7-2c33-433d-89c1-403c44aa0215" containerID="6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19" exitCode=0 Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140901 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerDied","Data":"6d6f8bce427f488369830d0db771a2f0ebc9acafad3d11ad30fd973780d11e19"} Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.140932 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"341fac245a4b0d13d49b64dfe9663a0128241e83acbecdb03149d70c90f7a0a5"} Feb 17 14:31:08 crc kubenswrapper[4804]: I0217 14:31:08.142781 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:31:14 crc kubenswrapper[4804]: I0217 14:31:14.574721 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:31:14 crc kubenswrapper[4804]: E0217 14:31:14.575743 4804 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:31:17 crc kubenswrapper[4804]: I0217 14:31:17.224366 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20"} Feb 17 14:31:19 crc kubenswrapper[4804]: I0217 14:31:19.242846 4804 generic.go:334] "Generic (PLEG): container finished" podID="3aa554a7-2c33-433d-89c1-403c44aa0215" containerID="7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20" exitCode=0 Feb 17 14:31:19 crc kubenswrapper[4804]: I0217 14:31:19.242955 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerDied","Data":"7c21f8597daa07cd863767e37bd4fca56d166a3f2cc1394deb226b8f50cefd20"} Feb 17 14:31:20 crc kubenswrapper[4804]: I0217 14:31:20.256977 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn8mk" event={"ID":"3aa554a7-2c33-433d-89c1-403c44aa0215","Type":"ContainerStarted","Data":"c8b081f9fe887ef42d98a032a66e58fb0f063c9b148b47aefb5385b2ba5b192e"} Feb 17 14:31:20 crc kubenswrapper[4804]: I0217 14:31:20.275805 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn8mk" podStartSLOduration=2.733228632 podStartE2EDuration="14.275785008s" podCreationTimestamp="2026-02-17 14:31:06 +0000 UTC" firstStartedPulling="2026-02-17 14:31:08.142583072 +0000 UTC 
m=+3942.254002409" lastFinishedPulling="2026-02-17 14:31:19.685139438 +0000 UTC m=+3953.796558785" observedRunningTime="2026-02-17 14:31:20.274781196 +0000 UTC m=+3954.386200543" watchObservedRunningTime="2026-02-17 14:31:20.275785008 +0000 UTC m=+3954.387204345" Feb 17 14:31:25 crc kubenswrapper[4804]: I0217 14:31:25.574418 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:31:25 crc kubenswrapper[4804]: E0217 14:31:25.575333 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.077827 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.078168 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.501284 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.555328 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn8mk" Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.624425 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn8mk"] Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.744723 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-bhcxz"] Feb 17 14:31:27 crc kubenswrapper[4804]: I0217 14:31:27.744978 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bhcxz" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" containerID="cri-o://99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca" gracePeriod=2 Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.338894 4804 generic.go:334] "Generic (PLEG): container finished" podID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerID="99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca" exitCode=0 Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.338966 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca"} Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.339266 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhcxz" event={"ID":"fdf90149-055d-48ca-9336-ca6d6545f8a3","Type":"ContainerDied","Data":"b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208"} Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.339278 4804 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36f1a9d3bb9bce0d65ed7aea0a70bcace69dc6992d2df01c00c6c4e740bc208" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.364340 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.467959 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.468065 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.468158 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") pod \"fdf90149-055d-48ca-9336-ca6d6545f8a3\" (UID: \"fdf90149-055d-48ca-9336-ca6d6545f8a3\") " Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.469523 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities" (OuterVolumeSpecName: "utilities") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.477056 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4" (OuterVolumeSpecName: "kube-api-access-l7rn4") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "kube-api-access-l7rn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.570679 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7rn4\" (UniqueName: \"kubernetes.io/projected/fdf90149-055d-48ca-9336-ca6d6545f8a3-kube-api-access-l7rn4\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.570977 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.607183 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdf90149-055d-48ca-9336-ca6d6545f8a3" (UID: "fdf90149-055d-48ca-9336-ca6d6545f8a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:28 crc kubenswrapper[4804]: I0217 14:31:28.674690 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf90149-055d-48ca-9336-ca6d6545f8a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.346355 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhcxz" Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.382096 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"] Feb 17 14:31:29 crc kubenswrapper[4804]: I0217 14:31:29.398720 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bhcxz"] Feb 17 14:31:30 crc kubenswrapper[4804]: I0217 14:31:30.585922 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" path="/var/lib/kubelet/pods/fdf90149-055d-48ca-9336-ca6d6545f8a3/volumes" Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.493324 4804 scope.go:117] "RemoveContainer" containerID="99e2aa4e9ffd4764c886e89b267517bc69e0446a4dde7f269ace85ac34cf8bca" Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.537024 4804 scope.go:117] "RemoveContainer" containerID="655e7850618eb7f1a6d3ae03ba1313c40721cf71550535d385a4aa123058d615" Feb 17 14:31:36 crc kubenswrapper[4804]: I0217 14:31:36.564365 4804 scope.go:117] "RemoveContainer" containerID="aec9aafaeb0231fd50b93156ef23ec8d4f34ac9ec3ae7c91631e24543663c093" Feb 17 14:31:40 crc kubenswrapper[4804]: I0217 14:31:40.573895 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:31:40 crc kubenswrapper[4804]: E0217 14:31:40.574591 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.660433 4804 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-6mdsd/must-gather-64wjc"] Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661484 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-utilities" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661500 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-utilities" Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661534 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-content" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661543 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="extract-content" Feb 17 14:31:48 crc kubenswrapper[4804]: E0217 14:31:48.661561 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661569 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.661814 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf90149-055d-48ca-9336-ca6d6545f8a3" containerName="registry-server" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.663044 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665248 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6mdsd"/"openshift-service-ca.crt" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665505 4804 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6mdsd"/"default-dockercfg-dhrsb" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.665606 4804 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6mdsd"/"kube-root-ca.crt" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.671474 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"] Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.693580 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.693878 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.795560 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " 
pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.795607 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.796204 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.817352 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"must-gather-64wjc\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:48 crc kubenswrapper[4804]: I0217 14:31:48.979715 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:31:49 crc kubenswrapper[4804]: I0217 14:31:49.448420 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"] Feb 17 14:31:49 crc kubenswrapper[4804]: W0217 14:31:49.453661 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde9029fd_fb98_4bf0_a6fc_0baf663a4e92.slice/crio-0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341 WatchSource:0}: Error finding container 0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341: Status 404 returned error can't find the container with id 0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341 Feb 17 14:31:49 crc kubenswrapper[4804]: I0217 14:31:49.519562 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"0f10fabdb124ae129b670cabd84de1d0518943d7f62e87c77537f6f81cb52341"} Feb 17 14:31:50 crc kubenswrapper[4804]: I0217 14:31:50.532175 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"} Feb 17 14:31:51 crc kubenswrapper[4804]: I0217 14:31:51.542190 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerStarted","Data":"8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67"} Feb 17 14:31:51 crc kubenswrapper[4804]: I0217 14:31:51.567522 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6mdsd/must-gather-64wjc" podStartSLOduration=3.567475767 
podStartE2EDuration="3.567475767s" podCreationTimestamp="2026-02-17 14:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:51.557266717 +0000 UTC m=+3985.668686054" watchObservedRunningTime="2026-02-17 14:31:51.567475767 +0000 UTC m=+3985.678895104" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.338691 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"] Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.344186 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.413409 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.413706 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.514943 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.515002 4804 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.515144 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.543314 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"crc-debug-m6kvm\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.573733 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:31:54 crc kubenswrapper[4804]: E0217 14:31:54.574021 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:31:54 crc kubenswrapper[4804]: I0217 14:31:54.673868 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:31:54 crc kubenswrapper[4804]: W0217 14:31:54.728931 4804 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6c19278_b1a0_4a84_b3c5_70b6cc6ad7f6.slice/crio-9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765 WatchSource:0}: Error finding container 9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765: Status 404 returned error can't find the container with id 9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765 Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.594874 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerStarted","Data":"48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779"} Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.595400 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerStarted","Data":"9068d964985dc3f74636fc38444e9611dd2eacf2629fc94b18531687a7e7c765"} Feb 17 14:31:55 crc kubenswrapper[4804]: I0217 14:31:55.614048 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" podStartSLOduration=1.614030247 podStartE2EDuration="1.614030247s" podCreationTimestamp="2026-02-17 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:55.605882212 +0000 UTC m=+3989.717301549" watchObservedRunningTime="2026-02-17 14:31:55.614030247 +0000 UTC m=+3989.725449584" Feb 17 14:32:07 crc kubenswrapper[4804]: I0217 14:32:07.574240 4804 scope.go:117] "RemoveContainer" 
containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:07 crc kubenswrapper[4804]: E0217 14:32:07.575154 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:19 crc kubenswrapper[4804]: I0217 14:32:19.573789 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:19 crc kubenswrapper[4804]: E0217 14:32:19.574672 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:28 crc kubenswrapper[4804]: I0217 14:32:28.909944 4804 generic.go:334] "Generic (PLEG): container finished" podID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerID="48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779" exitCode=0 Feb 17 14:32:28 crc kubenswrapper[4804]: I0217 14:32:28.910138 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" event={"ID":"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6","Type":"ContainerDied","Data":"48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779"} Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.177914 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.214685 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"] Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.223245 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-m6kvm"] Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.225743 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") pod \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.225808 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") pod \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\" (UID: \"e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6\") " Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.227422 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host" (OuterVolumeSpecName: "host") pod "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" (UID: "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.231812 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n" (OuterVolumeSpecName: "kube-api-access-ncv6n") pod "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" (UID: "e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6"). InnerVolumeSpecName "kube-api-access-ncv6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.328030 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.328072 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncv6n\" (UniqueName: \"kubernetes.io/projected/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6-kube-api-access-ncv6n\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.584468 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" path="/var/lib/kubelet/pods/e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6/volumes" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.942527 4804 scope.go:117] "RemoveContainer" containerID="48ccee6d738eec52fe3ebe02fb9da09777623625ef4e66906d7c6643e3e9b779" Feb 17 14:32:30 crc kubenswrapper[4804]: I0217 14:32:30.942676 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-m6kvm" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.444740 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:31 crc kubenswrapper[4804]: E0217 14:32:31.445555 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.445574 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.445768 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c19278-b1a0-4a84-b3c5-70b6cc6ad7f6" containerName="container-00" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.446416 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.555442 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.555574 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658043 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658183 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.658187 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.688897 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"crc-debug-2pwmf\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.772314 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:31 crc kubenswrapper[4804]: I0217 14:32:31.953285 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" event={"ID":"768fb954-46e9-4df8-89f9-b20f65c39f9e","Type":"ContainerStarted","Data":"989bfe1a7fa274157f4057abd8aa1fc74de862c999a0d219cb684917d6013e54"} Feb 17 14:32:32 crc kubenswrapper[4804]: I0217 14:32:32.963251 4804 generic.go:334] "Generic (PLEG): container finished" podID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerID="c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07" exitCode=0 Feb 17 14:32:32 crc kubenswrapper[4804]: I0217 14:32:32.963370 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" event={"ID":"768fb954-46e9-4df8-89f9-b20f65c39f9e","Type":"ContainerDied","Data":"c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07"} Feb 17 14:32:33 crc kubenswrapper[4804]: I0217 14:32:33.408650 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:33 crc kubenswrapper[4804]: I0217 14:32:33.423467 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-2pwmf"] Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.114338 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205003 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") pod \"768fb954-46e9-4df8-89f9-b20f65c39f9e\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205158 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host" (OuterVolumeSpecName: "host") pod "768fb954-46e9-4df8-89f9-b20f65c39f9e" (UID: "768fb954-46e9-4df8-89f9-b20f65c39f9e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.205406 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") pod \"768fb954-46e9-4df8-89f9-b20f65c39f9e\" (UID: \"768fb954-46e9-4df8-89f9-b20f65c39f9e\") " Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.206048 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/768fb954-46e9-4df8-89f9-b20f65c39f9e-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.210786 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t" (OuterVolumeSpecName: "kube-api-access-wvc5t") pod "768fb954-46e9-4df8-89f9-b20f65c39f9e" (UID: "768fb954-46e9-4df8-89f9-b20f65c39f9e"). InnerVolumeSpecName "kube-api-access-wvc5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.308310 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvc5t\" (UniqueName: \"kubernetes.io/projected/768fb954-46e9-4df8-89f9-b20f65c39f9e-kube-api-access-wvc5t\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.573898 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:34 crc kubenswrapper[4804]: E0217 14:32:34.574256 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.585924 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" path="/var/lib/kubelet/pods/768fb954-46e9-4df8-89f9-b20f65c39f9e/volumes" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742290 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:34 crc kubenswrapper[4804]: E0217 14:32:34.742679 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742697 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.742880 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="768fb954-46e9-4df8-89f9-b20f65c39f9e" 
containerName="container-00" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.743490 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.817808 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.817905 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919181 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919377 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.919422 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") 
pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.939529 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"crc-debug-66jj7\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.986129 4804 scope.go:117] "RemoveContainer" containerID="c72740c43ea69de8ef43e8dc6df1e3ebad0283045c01f7e18f815758350cdc07" Feb 17 14:32:34 crc kubenswrapper[4804]: I0217 14:32:34.986176 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-2pwmf" Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.059656 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995339 4804 generic.go:334] "Generic (PLEG): container finished" podID="01160288-3510-4001-8a02-c356f2b354f1" containerID="e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745" exitCode=0 Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995780 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" event={"ID":"01160288-3510-4001-8a02-c356f2b354f1","Type":"ContainerDied","Data":"e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745"} Feb 17 14:32:35 crc kubenswrapper[4804]: I0217 14:32:35.995805 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" event={"ID":"01160288-3510-4001-8a02-c356f2b354f1","Type":"ContainerStarted","Data":"2c574eafadbda7ee7470a18eebfb317e6ae81c8e1704b0d945ce0de7b25c2705"} Feb 17 14:32:36 crc 
kubenswrapper[4804]: I0217 14:32:36.037651 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:36 crc kubenswrapper[4804]: I0217 14:32:36.047534 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/crc-debug-66jj7"] Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.106340 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.160772 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") pod \"01160288-3510-4001-8a02-c356f2b354f1\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.161127 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") pod \"01160288-3510-4001-8a02-c356f2b354f1\" (UID: \"01160288-3510-4001-8a02-c356f2b354f1\") " Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.160855 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host" (OuterVolumeSpecName: "host") pod "01160288-3510-4001-8a02-c356f2b354f1" (UID: "01160288-3510-4001-8a02-c356f2b354f1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.161722 4804 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01160288-3510-4001-8a02-c356f2b354f1-host\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.165905 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5" (OuterVolumeSpecName: "kube-api-access-zb9x5") pod "01160288-3510-4001-8a02-c356f2b354f1" (UID: "01160288-3510-4001-8a02-c356f2b354f1"). InnerVolumeSpecName "kube-api-access-zb9x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:37 crc kubenswrapper[4804]: I0217 14:32:37.263616 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb9x5\" (UniqueName: \"kubernetes.io/projected/01160288-3510-4001-8a02-c356f2b354f1-kube-api-access-zb9x5\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.015572 4804 scope.go:117] "RemoveContainer" containerID="e3cf401ef670ad8e345476461ab15c106659c4047295fe10a7113e202c7d1745" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.015731 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/crc-debug-66jj7" Feb 17 14:32:38 crc kubenswrapper[4804]: I0217 14:32:38.586645 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01160288-3510-4001-8a02-c356f2b354f1" path="/var/lib/kubelet/pods/01160288-3510-4001-8a02-c356f2b354f1/volumes" Feb 17 14:32:45 crc kubenswrapper[4804]: I0217 14:32:45.574157 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:45 crc kubenswrapper[4804]: E0217 14:32:45.574885 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:32:59 crc kubenswrapper[4804]: I0217 14:32:59.573991 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:32:59 crc kubenswrapper[4804]: E0217 14:32:59.574759 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.430984 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.459523 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7cc7c97fdd-bhd7w_2b89da32-9537-4c7b-a266-0d38ac52b069/barbican-api-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.639587 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.657916 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f46489f4-x24zj_297a0648-3cbd-4f1e-8bc4-d918a702c33b/barbican-keystone-listener-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.705677 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.824834 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f97f9545f-tngcj_c7f4e4c3-9ec8-4923-bf7b-4058899e863f/barbican-worker-log/0.log" Feb 17 14:33:12 crc kubenswrapper[4804]: I0217 14:33:12.907014 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bv26p_9ee075c2-2363-4446-8545-dfdece6ca4da/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.038945 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-central-agent/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.110734 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/ceilometer-notification-agent/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.119982 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/proxy-httpd/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.195382 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_39bfc426-b9af-40b4-a713-26bb2366db7a/sg-core/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.337726 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api-log/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.416593 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_271eb4b4-5e4f-4ab8-8ce0-8ad5633a6a92/cinder-api/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.580087 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/cinder-scheduler/0.log" Feb 17 14:33:13 crc kubenswrapper[4804]: I0217 14:33:13.608820 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f7170af0-a08f-4b96-b93a-5353d633a82f/probe/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.083308 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-499xq_5c4e88aa-842f-453a-9ce9-8354c16340e9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.149067 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vr7tq_5ca70007-e938-4bd5-9f2a-66f18b87743a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.286341 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.445936 
4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/init/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.553954 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pcpjc_5ecc3e55-21c0-4017-8dce-9c77fd2189ea/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.575676 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:33:14 crc kubenswrapper[4804]: E0217 14:33:14.575997 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.587514 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-2n5kn_69619ab8-5a40-43b9-8e9c-1a6e39893605/dnsmasq-dns/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.720543 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-log/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.734415 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cc2e7136-825b-4608-a106-944f359c7369/glance-httpd/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.887264 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-httpd/0.log" Feb 17 14:33:14 crc kubenswrapper[4804]: I0217 14:33:14.936562 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_52f268a5-3c72-4655-bb36-823c34e5312d/glance-log/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.340277 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.440547 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-65nc8_0a55b597-4920-4fa6-99d5-6deaa6f30a4a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.654026 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9ffb6f5c6-fczv5_e82dcc0e-38b4-43f0-a9e2-6e15915d2d0f/horizon-log/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.668117 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hx4nm_e9b53a85-8a87-4b65-8832-00c4175da541/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.897265 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522281-k9ptv_c2d1f319-5d08-4969-a968-45eba20958a7/keystone-cron/0.log" Feb 17 14:33:15 crc kubenswrapper[4804]: I0217 14:33:15.960096 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-9cc757857-wng6k_30df70d3-9323-4ddd-9d1c-2dae72cff6d9/keystone-api/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.081227 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_d6aabf20-b0bf-4f35-aec7-098f38bacfd9/kube-state-metrics/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.173039 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nf5vc_c0aad2ba-98cf-42b5-9c03-40633fb8ac18/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.445596 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-api/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.455952 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c576cfd85-655nj_fb86b3d7-c6a3-43d5-a8da-805aa7d73a66/neutron-httpd/0.log" Feb 17 14:33:16 crc kubenswrapper[4804]: I0217 14:33:16.567112 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-66cmg_84938cd5-694c-423a-a0d1-801f28085377/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.115956 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-log/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.195362 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fc78e86d-494e-417b-8569-b564cdbd069a/nova-cell0-conductor-conductor/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.435573 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a13dbc73-75fc-448b-af44-cb7018d1640e/nova-cell1-conductor-conductor/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.569469 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_29528202-42d5-4bcd-90e8-335435ba59cf/nova-api-api/0.log" 
Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.598924 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5c380610-c164-4798-a5df-9b90fd475667/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.711738 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-x8lml_9f17dd92-0402-40c7-bdc7-50b38e37f750/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:17 crc kubenswrapper[4804]: I0217 14:33:17.889627 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-log/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.262543 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1bac289d-58a7-4e23-8805-c48811d12d32/nova-scheduler-scheduler/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.299946 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.457149 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.508237 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f9eb8e8f-8bd1-4f69-84ee-27213046c709/galera/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.677692 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.853803 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/galera/0.log" Feb 17 14:33:18 crc kubenswrapper[4804]: I0217 14:33:18.858928 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_49b02c8f-ff07-48f9-8012-e78dc6591499/mysql-bootstrap/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.081493 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_de1a53e3-68ce-4ecd-9c0a-80ffce568891/openstackclient/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.135933 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4s7l5_d286aa08-b0df-44e8-9128-f596f4b44db8/openstack-network-exporter/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.307808 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ee4c15c1-5fb0-4605-9cb8-69a060ec0d39/nova-metadata-metadata/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.308242 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.509660 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovs-vswitchd/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.526431 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server-init/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.612153 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p4wrm_45330d20-989c-4507-ae57-5beaee075484/ovsdb-server/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.758819 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-rzcfd_9c049787-03d2-4679-8705-ec2cd1ad8141/ovn-controller/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.837769 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v478m_be98213b-0510-4f69-9d98-81363c04d8bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.963767 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/openstack-network-exporter/0.log" Feb 17 14:33:19 crc kubenswrapper[4804]: I0217 14:33:19.998754 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e322ccb-33cf-466f-91fb-63781bdcffb6/ovn-northd/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.116753 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/openstack-network-exporter/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.199384 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fc5c8da-b323-4afb-aa47-125fc63caefd/ovsdbserver-nb/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.328834 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/ovsdbserver-sb/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.366065 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_10e1124a-f402-422d-a906-8d22c90d4abe/openstack-network-exporter/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.522635 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-api/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.616413 4804 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6d69649784-lnwhw_858d67cb-268b-4724-bba9-a7ab9a10ed6c/placement-log/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.686122 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.887351 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.892426 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/setup-container/0.log" Feb 17 14:33:20 crc kubenswrapper[4804]: I0217 14:33:20.927637 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4c7ecd09-cd15-439d-9153-b55d9013bb83/rabbitmq/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.097575 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/rabbitmq/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.176431 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5f204e4-3b7a-4490-9c78-def5bf30f810/setup-container/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.223207 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lbd66_100d84c5-396c-4772-af09-2e223e72a640/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:21 crc kubenswrapper[4804]: I0217 14:33:21.963109 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z6s9f_c87b0376-c505-452b-90ed-0e6bb7e6e8e0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc 
kubenswrapper[4804]: I0217 14:33:22.003083 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zctst_ec9c23b1-5a00-45c9-bcbe-e23c629c3bcd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.156449 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf97c_01fe0e44-6604-4e17-bcb4-05f202508fc7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.246786 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9jrnh_cdb9b3eb-f3d1-4a32-8a87-b0f686cad260/ssh-known-hosts-edpm-deployment/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.477107 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-server/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.507040 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-59cfdfc65f-48l6n_be0372d3-4646-46e7-af04-6977a7426f35/proxy-httpd/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.548368 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mv8w5_41aa78f0-ef58-4a36-b1f9-ce222fd8e1e2/swift-ring-rebalance/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.734874 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-reaper/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.752412 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-auditor/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.820340 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-replicator/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.948017 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-auditor/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.950360 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/account-server/0.log" Feb 17 14:33:22 crc kubenswrapper[4804]: I0217 14:33:22.999258 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-replicator/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.048125 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-server/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.156246 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-auditor/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.156847 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/container-updater/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.210485 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-expirer/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.776137 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-server/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.780609 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-updater/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.784973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/object-replicator/0.log" Feb 17 14:33:23 crc kubenswrapper[4804]: I0217 14:33:23.800300 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/rsync/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.002428 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_90da6e89-6033-4e42-a5ca-bed1a5ad6a46/swift-recon-cron/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.077451 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wtq55_0b8bd88a-7a73-4cd4-8be6-e4adb201bfc7/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.225509 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7b246dc-1d07-4725-b471-88fe82584d24/tempest-tests-tempest-tests-runner/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.269376 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4c6dcbcb-8248-40b5-8fd6-7824c487109e/test-operator-logs-container/0.log" Feb 17 14:33:24 crc kubenswrapper[4804]: I0217 14:33:24.485339 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-bt6vb_ed6642bc-b49f-4e17-a721-b3eae09246aa/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 14:33:29 crc kubenswrapper[4804]: I0217 14:33:29.575193 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 
14:33:29 crc kubenswrapper[4804]: E0217 14:33:29.576676 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.286296 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:35 crc kubenswrapper[4804]: E0217 14:33:35.292443 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.292470 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.292729 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="01160288-3510-4001-8a02-c356f2b354f1" containerName="container-00" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.294685 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.301649 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388744 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388794 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.388813 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.427888 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f5ef96d0-19a6-4561-bde2-cf38e0280b39/memcached/0.log" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.490979 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: 
\"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491058 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491085 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491663 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.491818 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.529091 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"redhat-marketplace-5z45j\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " 
pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:35 crc kubenswrapper[4804]: I0217 14:33:35.651385 4804 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.165166 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.581387 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" exitCode=0 Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.582625 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"} Feb 17 14:33:36 crc kubenswrapper[4804]: I0217 14:33:36.582659 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"d0388e8ed5eb0bf260d3d3be7512607aa673bcc17891d6b49bd3ddf27df1381b"} Feb 17 14:33:37 crc kubenswrapper[4804]: I0217 14:33:37.594224 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} Feb 17 14:33:38 crc kubenswrapper[4804]: I0217 14:33:38.605376 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" exitCode=0 Feb 17 14:33:38 crc kubenswrapper[4804]: I0217 14:33:38.605492 4804 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} Feb 17 14:33:40 crc kubenswrapper[4804]: I0217 14:33:40.625464 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerStarted","Data":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} Feb 17 14:33:40 crc kubenswrapper[4804]: I0217 14:33:40.645797 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5z45j" podStartSLOduration=3.246874107 podStartE2EDuration="5.645777225s" podCreationTimestamp="2026-02-17 14:33:35 +0000 UTC" firstStartedPulling="2026-02-17 14:33:36.582769929 +0000 UTC m=+4090.694189256" lastFinishedPulling="2026-02-17 14:33:38.981673037 +0000 UTC m=+4093.093092374" observedRunningTime="2026-02-17 14:33:40.640340865 +0000 UTC m=+4094.751760202" watchObservedRunningTime="2026-02-17 14:33:40.645777225 +0000 UTC m=+4094.757196562" Feb 17 14:33:43 crc kubenswrapper[4804]: I0217 14:33:43.574436 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:33:43 crc kubenswrapper[4804]: E0217 14:33:43.575532 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.652393 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.652710 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.706726 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:45 crc kubenswrapper[4804]: I0217 14:33:45.750321 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.057834 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.059314 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5z45j" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server" containerID="cri-o://874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" gracePeriod=2 Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.565402 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688657 4804 generic.go:334] "Generic (PLEG): container finished" podID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" exitCode=0 Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688695 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.688724 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5z45j" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.689034 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5z45j" event={"ID":"9ee4631f-2436-4b96-bb8c-4137382e12aa","Type":"ContainerDied","Data":"d0388e8ed5eb0bf260d3d3be7512607aa673bcc17891d6b49bd3ddf27df1381b"} Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.689060 4804 scope.go:117] "RemoveContainer" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.720616 4804 scope.go:117] "RemoveContainer" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.738489 4804 scope.go:117] "RemoveContainer" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752000 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") pod 
\"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752154 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") pod \"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.752186 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") pod \"9ee4631f-2436-4b96-bb8c-4137382e12aa\" (UID: \"9ee4631f-2436-4b96-bb8c-4137382e12aa\") " Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.753029 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities" (OuterVolumeSpecName: "utilities") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.758367 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj" (OuterVolumeSpecName: "kube-api-access-txwcj") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "kube-api-access-txwcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.779895 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ee4631f-2436-4b96-bb8c-4137382e12aa" (UID: "9ee4631f-2436-4b96-bb8c-4137382e12aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831073 4804 scope.go:117] "RemoveContainer" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.831360 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": container with ID starting with 874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba not found: ID does not exist" containerID="874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831407 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba"} err="failed to get container status \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": rpc error: code = NotFound desc = could not find container \"874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba\": container with ID starting with 874328d39915119650be694f68cfbee0fc608658ffcd821e02d467aba49cfbba not found: ID does not exist" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831427 4804 scope.go:117] "RemoveContainer" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.831789 4804 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": container with ID starting with 082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65 not found: ID does not exist" containerID="082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831842 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65"} err="failed to get container status \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": rpc error: code = NotFound desc = could not find container \"082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65\": container with ID starting with 082b8a4b1004bf301dc5d2aa3c124a734e32ac8a029f0331e6f1751d57c6ee65 not found: ID does not exist" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.831876 4804 scope.go:117] "RemoveContainer" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" Feb 17 14:33:48 crc kubenswrapper[4804]: E0217 14:33:48.832218 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": container with ID starting with 15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7 not found: ID does not exist" containerID="15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.832250 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7"} err="failed to get container status \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": rpc error: code = NotFound desc = could 
not find container \"15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7\": container with ID starting with 15694f37ee9b5bb7a000790ce1d88106ede67c90ce06bde635c4327ae7320ea7 not found: ID does not exist" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854278 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854315 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ee4631f-2436-4b96-bb8c-4137382e12aa-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:48 crc kubenswrapper[4804]: I0217 14:33:48.854332 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txwcj\" (UniqueName: \"kubernetes.io/projected/9ee4631f-2436-4b96-bb8c-4137382e12aa-kube-api-access-txwcj\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:49 crc kubenswrapper[4804]: I0217 14:33:49.018811 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:49 crc kubenswrapper[4804]: I0217 14:33:49.027809 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5z45j"] Feb 17 14:33:50 crc kubenswrapper[4804]: I0217 14:33:50.599337 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" path="/var/lib/kubelet/pods/9ee4631f-2436-4b96-bb8c-4137382e12aa/volumes" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.186847 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.385395 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.396024 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.412792 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.624485 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/pull/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.624518 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/extract/0.log" Feb 17 14:33:54 crc kubenswrapper[4804]: I0217 14:33:54.647093 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69fd5f937ff344d5ad8963bc66e11c74507d0ab75326b7adb48a42ae68mbxjq_fc2739bc-c729-4c9f-856b-9a08143fc359/util/0.log" Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.071478 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-bslfv_fbc5e6cd-47c6-4199-a0f2-e4292a836fac/manager/0.log" Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.434693 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-vt6zw_5796dc62-bd84-48b7-9c4c-7d5bf1f7e984/manager/0.log" Feb 17 14:33:55 crc 
kubenswrapper[4804]: I0217 14:33:55.659867 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-sxtr2_5727ae12-4720-4470-b5cc-8b8ae81c2af7/manager/0.log" Feb 17 14:33:55 crc kubenswrapper[4804]: I0217 14:33:55.901291 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-t6hlr_5fa66dc5-a518-40dd-a4b5-dd2b34425ad5/manager/0.log" Feb 17 14:33:56 crc kubenswrapper[4804]: I0217 14:33:56.344294 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-cdpkr_07b97973-fa08-4b79-9164-918a4d04f8b7/manager/0.log" Feb 17 14:33:56 crc kubenswrapper[4804]: I0217 14:33:56.528752 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-lrjgg_bf13099a-fbab-41bf-b30c-5c6b1049af19/manager/0.log" Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.017025 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-pddsh_430279ab-ba2f-4838-ab07-b851d4df84a0/manager/0.log" Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.192269 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-wn64m_0b746a42-c0b4-4cb9-9352-3623669bad5a/manager/0.log" Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.278019 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-88sh4_d3332002-6930-418f-8288-e8344be70c6a/manager/0.log" Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.426114 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-vkdg2_2546387a-6a42-4f8d-a321-2f9cbaa11adb/manager/0.log" Feb 17 
14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.491746 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-l5cl2_97925efc-eb46-4a60-b372-b31f13a2c876/manager/0.log" Feb 17 14:33:57 crc kubenswrapper[4804]: I0217 14:33:57.760190 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-c8hmm_36b1ca46-becb-417e-b05e-777d40246cb6/manager/0.log" Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.067296 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-6zv88_ae7598b8-fff5-4044-bbd7-0c8f2f60eed8/manager/0.log" Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.442711 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7cb8c4979f-kfx9x_f69fc148-3a8b-4065-b075-85ecad8339e7/operator/0.log" Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.578879 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:33:58 crc kubenswrapper[4804]: E0217 14:33:58.579189 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:33:58 crc kubenswrapper[4804]: I0217 14:33:58.772046 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-55nc6_13d9e436-3cb0-4df0-aaf9-e614eba74c89/registry-server/0.log" Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.507289 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-ltwrc_ac1e20c8-4527-4bba-85bd-2154e1244d3e/manager/0.log" Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.638465 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-9vbg5_42505b9c-f878-4feb-b9a1-9dfa11ec0f56/manager/0.log" Feb 17 14:33:59 crc kubenswrapper[4804]: I0217 14:33:59.898896 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rtlpm_44ec973d-9403-48f4-b92c-72f0bd485b0f/operator/0.log" Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.115035 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-n6fl9_f94e791f-16fd-4364-a246-35bcca0d14e6/manager/0.log" Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.454973 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-rbrxl_067b67c8-64c5-4c21-b1b1-770aa68e0eb7/manager/0.log" Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.533438 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-ptrs5_79eb8fb0-6207-44c8-b3c2-a00116bcf10b/manager/0.log" Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.557376 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5744df64c-mkkrv_8155784a-3945-4ca3-aa9a-b0e089ffac52/manager/0.log" Feb 17 14:34:00 crc kubenswrapper[4804]: I0217 14:34:00.672503 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-nwmk5_1c7ad838-6225-4001-899a-7f741cb75f2f/manager/0.log" Feb 17 14:34:01 crc kubenswrapper[4804]: I0217 14:34:01.300863 4804 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-xlwmb_57038414-fcca-4a2a-8756-46f97cc57d81/manager/0.log" Feb 17 14:34:05 crc kubenswrapper[4804]: I0217 14:34:05.472573 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-4xvfg_545c7d25-7774-4c62-89b8-f491fd4065e8/manager/0.log" Feb 17 14:34:11 crc kubenswrapper[4804]: I0217 14:34:11.574140 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:34:11 crc kubenswrapper[4804]: E0217 14:34:11.574860 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.528700 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t4m4g_6c98dfab-f166-4eb4-b385-724d6b9b9d7a/control-plane-machine-set-operator/0.log" Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.574447 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:34:23 crc kubenswrapper[4804]: E0217 14:34:23.574760 4804 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7c5_openshift-machine-config-operator(6992e22f-b963-46fc-ac41-4ca9938dda85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.592904 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/kube-rbac-proxy/0.log" Feb 17 14:34:23 crc kubenswrapper[4804]: I0217 14:34:23.701309 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-spfls_17c8a131-fc0e-44b5-b374-846e6b2aeb1c/machine-api-operator/0.log" Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.514262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7sfkb_112c357f-f1dc-4a07-bba0-ddf54ab071ff/cert-manager-controller/0.log" Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.766929 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kbdz5_9d2d8008-6348-4f24-8085-d30db8558ab3/cert-manager-cainjector/0.log" Feb 17 14:34:35 crc kubenswrapper[4804]: I0217 14:34:35.787583 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c8nh8_be70f757-4537-489d-a86e-a1b49fc9af75/cert-manager-webhook/0.log" Feb 17 14:34:36 crc kubenswrapper[4804]: I0217 14:34:36.580364 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:34:37 crc kubenswrapper[4804]: I0217 14:34:37.164795 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"} Feb 17 14:34:49 crc kubenswrapper[4804]: I0217 14:34:49.795230 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bgf7w_2158c202-5aa4-47aa-87a1-73e4b9043e78/nmstate-console-plugin/0.log" Feb 17 14:34:49 crc kubenswrapper[4804]: I0217 14:34:49.998177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jxn7r_81e46a71-360c-4509-ad38-2b2c814a56c2/nmstate-handler/0.log" Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.028612 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/nmstate-metrics/0.log" Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.034566 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8gkbz_18e3c061-8633-471f-b2ab-e87e3c0b5d44/kube-rbac-proxy/0.log" Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.216363 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rkf7s_2789dcb9-5619-4986-a692-1eec733c97ff/nmstate-operator/0.log" Feb 17 14:34:50 crc kubenswrapper[4804]: I0217 14:34:50.236817 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-dbfqz_36fd4ae3-048e-4e51-b2fa-875a5c84b8e0/nmstate-webhook/0.log" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.315935 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317170 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-utilities" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317186 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-utilities" Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317217 4804 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-content" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317225 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="extract-content" Feb 17 14:35:01 crc kubenswrapper[4804]: E0217 14:35:01.317261 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317269 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.317526 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee4631f-2436-4b96-bb8c-4137382e12aa" containerName="registry-server" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.319160 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.327837 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411514 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411622 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.411930 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514122 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514171 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514258 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.514930 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.515009 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.859469 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"community-operators-mxm8x\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:01 crc kubenswrapper[4804]: I0217 14:35:01.937789 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:02 crc kubenswrapper[4804]: I0217 14:35:02.368490 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:02 crc kubenswrapper[4804]: I0217 14:35:02.397628 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerStarted","Data":"ca5f5f0230bc282ba835999b8fdde5d2b83e78fcf1ef6d89ebdf7902bbe288d1"} Feb 17 14:35:03 crc kubenswrapper[4804]: I0217 14:35:03.406844 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6" exitCode=0 Feb 17 14:35:03 crc kubenswrapper[4804]: I0217 14:35:03.407061 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"} Feb 17 14:35:05 crc kubenswrapper[4804]: I0217 14:35:05.422209 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c" exitCode=0 Feb 17 14:35:05 crc kubenswrapper[4804]: I0217 14:35:05.422341 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"} Feb 17 14:35:06 crc kubenswrapper[4804]: I0217 14:35:06.436590 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" 
event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerStarted","Data":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"} Feb 17 14:35:06 crc kubenswrapper[4804]: I0217 14:35:06.469505 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxm8x" podStartSLOduration=3.038177591 podStartE2EDuration="5.469482854s" podCreationTimestamp="2026-02-17 14:35:01 +0000 UTC" firstStartedPulling="2026-02-17 14:35:03.409363334 +0000 UTC m=+4177.520782671" lastFinishedPulling="2026-02-17 14:35:05.840668597 +0000 UTC m=+4179.952087934" observedRunningTime="2026-02-17 14:35:06.463715213 +0000 UTC m=+4180.575134540" watchObservedRunningTime="2026-02-17 14:35:06.469482854 +0000 UTC m=+4180.580902191" Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.938296 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.938682 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:11 crc kubenswrapper[4804]: I0217 14:35:11.991318 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:12 crc kubenswrapper[4804]: I0217 14:35:12.537264 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:13 crc kubenswrapper[4804]: I0217 14:35:13.509325 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:14 crc kubenswrapper[4804]: I0217 14:35:14.504244 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxm8x" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" 
containerID="cri-o://92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" gracePeriod=2 Feb 17 14:35:14 crc kubenswrapper[4804]: I0217 14:35:14.968650 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069521 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069620 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.069663 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") pod \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\" (UID: \"2a8f57b2-1e50-4720-b9ec-832cc2e41c21\") " Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.070612 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities" (OuterVolumeSpecName: "utilities") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.076271 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8" (OuterVolumeSpecName: "kube-api-access-j2zd8") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "kube-api-access-j2zd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.125349 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a8f57b2-1e50-4720-b9ec-832cc2e41c21" (UID: "2a8f57b2-1e50-4720-b9ec-832cc2e41c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171775 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zd8\" (UniqueName: \"kubernetes.io/projected/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-kube-api-access-j2zd8\") on node \"crc\" DevicePath \"\"" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171831 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.171847 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8f57b2-1e50-4720-b9ec-832cc2e41c21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515278 4804 generic.go:334] "Generic (PLEG): container finished" podID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" 
containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" exitCode=0 Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515328 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"} Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515351 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxm8x" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515364 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxm8x" event={"ID":"2a8f57b2-1e50-4720-b9ec-832cc2e41c21","Type":"ContainerDied","Data":"ca5f5f0230bc282ba835999b8fdde5d2b83e78fcf1ef6d89ebdf7902bbe288d1"} Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.515386 4804 scope.go:117] "RemoveContainer" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.532320 4804 scope.go:117] "RemoveContainer" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.553067 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.556277 4804 scope.go:117] "RemoveContainer" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.561019 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxm8x"] Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.598744 4804 scope.go:117] "RemoveContainer" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" Feb 17 
14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.599248 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": container with ID starting with 92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64 not found: ID does not exist" containerID="92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599347 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64"} err="failed to get container status \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": rpc error: code = NotFound desc = could not find container \"92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64\": container with ID starting with 92dc22c59f067daa996bc55081ae7c70b39125629d95e2fe30eeac19a6291a64 not found: ID does not exist" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599426 4804 scope.go:117] "RemoveContainer" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c" Feb 17 14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.599735 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": container with ID starting with 8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c not found: ID does not exist" containerID="8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599829 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c"} err="failed to get container status 
\"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": rpc error: code = NotFound desc = could not find container \"8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c\": container with ID starting with 8985e5f7b194121e8dd9023d1e3d791cdf662048d6b8ea179bd1ac90bf008c4c not found: ID does not exist" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.599911 4804 scope.go:117] "RemoveContainer" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6" Feb 17 14:35:15 crc kubenswrapper[4804]: E0217 14:35:15.600192 4804 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": container with ID starting with 16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6 not found: ID does not exist" containerID="16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6" Feb 17 14:35:15 crc kubenswrapper[4804]: I0217 14:35:15.600315 4804 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6"} err="failed to get container status \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": rpc error: code = NotFound desc = could not find container \"16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6\": container with ID starting with 16bd6d08c8ac45d420e0b3f925e8936f91020c836bb90671fb2dda02634b13c6 not found: ID does not exist" Feb 17 14:35:16 crc kubenswrapper[4804]: I0217 14:35:16.587351 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" path="/var/lib/kubelet/pods/2a8f57b2-1e50-4720-b9ec-832cc2e41c21/volumes" Feb 17 14:35:18 crc kubenswrapper[4804]: I0217 14:35:18.677902 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/kube-rbac-proxy/0.log" Feb 17 14:35:18 crc kubenswrapper[4804]: I0217 14:35:18.820497 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-wg4pd_01625c42-e1b1-470d-b705-47b30fec457a/controller/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.014740 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.231995 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.241177 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.255450 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.263310 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.472342 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.493923 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.514321 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.560710 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.672964 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-frr-files/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.680454 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-reloader/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.715748 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/cp-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.777227 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/controller/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.891015 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr-metrics/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.932078 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy/0.log" Feb 17 14:35:19 crc kubenswrapper[4804]: I0217 14:35:19.981732 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/kube-rbac-proxy-frr/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.130244 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/reloader/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.260261 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-gl8tp_0d003d1c-2370-4291-a035-0ebe8b97cfee/frr-k8s-webhook-server/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.419595 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c7c468df9-kbjlb_c17333d4-cfc6-4129-af9e-a8f2db54988b/manager/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.654694 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-996ff79d9-vm8dt_82716046-7f15-43d7-b9de-8fdb68a44c0b/webhook-server/0.log" Feb 17 14:35:20 crc kubenswrapper[4804]: I0217 14:35:20.792028 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/kube-rbac-proxy/0.log" Feb 17 14:35:21 crc kubenswrapper[4804]: I0217 14:35:21.370266 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wrsrf_ef60181c-19a6-454c-a197-2b0af0ac2edf/speaker/0.log" Feb 17 14:35:21 crc kubenswrapper[4804]: I0217 14:35:21.469021 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5ls9t_2cf110f6-e70a-45af-a634-744262733250/frr/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.009353 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.209495 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" 
Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.216792 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.250667 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.442150 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/pull/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.450752 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/extract/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.456007 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213dcs6h_7e8c98d2-433f-46f9-a2f3-3a368c1b2608/util/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.647449 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.796987 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.800269 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.817658 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.982063 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-utilities/0.log" Feb 17 14:35:34 crc kubenswrapper[4804]: I0217 14:35:34.987890 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.203945 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.404352 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.409262 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.508983 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.585881 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jhxhx_5816c991-ba5a-4d3c-9d69-d28846ca92f6/registry-server/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.733530 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-content/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.768416 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/extract-utilities/0.log" Feb 17 14:35:35 crc kubenswrapper[4804]: I0217 14:35:35.963781 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.168893 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.232512 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.244496 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.445895 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/extract/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.481934 4804 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/util/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.485032 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab4s9d_17c12921-34cb-4c2e-9cb8-585348e46d30/pull/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.689256 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m2bjw_57d3429b-b2f5-49ea-94b2-b79aa1769367/registry-server/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.732843 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26cwx_78a56ea9-6641-4d2d-8471-b40e5f2cf7e5/marketplace-operator/0.log" Feb 17 14:35:36 crc kubenswrapper[4804]: I0217 14:35:36.883245 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.051321 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.059580 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.073058 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.258799 4804 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.298901 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.427327 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5fs82_e7d80260-64fd-4975-a620-5c515a765fd3/registry-server/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.444106 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.621755 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.658724 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.660806 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.821927 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-content/0.log" Feb 17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.829516 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/extract-utilities/0.log" Feb 
17 14:35:37 crc kubenswrapper[4804]: I0217 14:35:37.967590 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zn8mk_3aa554a7-2c33-433d-89c1-403c44aa0215/registry-server/0.log" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.985573 4804 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.986991 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-utilities" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987008 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-utilities" Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.987029 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-content" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987037 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="extract-content" Feb 17 14:36:44 crc kubenswrapper[4804]: E0217 14:36:44.987078 4804 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987089 4804 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.987395 4804 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8f57b2-1e50-4720-b9ec-832cc2e41c21" containerName="registry-server" Feb 17 14:36:44 crc kubenswrapper[4804]: I0217 14:36:44.988933 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.000545 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157000 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157103 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.157154 4804 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.258990 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259464 4804 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259889 4804 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.259987 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.260695 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.295730 4804 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"certified-operators-8cjvt\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.325976 4804 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:45 crc kubenswrapper[4804]: I0217 14:36:45.863149 4804 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:36:46 crc kubenswrapper[4804]: E0217 14:36:46.249526 4804 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bb8e4f_1bc0_4422_877c_3b9f26a4ded0.slice/crio-conmon-502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395417 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199" exitCode=0 Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395510 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199"} Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.395826 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerStarted","Data":"bfe6afe0591c71ea93e7728ff8ca0d1783e36d1a1e8760a729d643c649ba52b5"} Feb 17 14:36:46 crc kubenswrapper[4804]: I0217 14:36:46.398303 4804 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:36:52 crc kubenswrapper[4804]: I0217 14:36:52.454724 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29" exitCode=0 Feb 17 
14:36:52 crc kubenswrapper[4804]: I0217 14:36:52.454809 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29"} Feb 17 14:36:53 crc kubenswrapper[4804]: I0217 14:36:53.464930 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerStarted","Data":"c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd"} Feb 17 14:36:53 crc kubenswrapper[4804]: I0217 14:36:53.490830 4804 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8cjvt" podStartSLOduration=3.039248967 podStartE2EDuration="9.490806715s" podCreationTimestamp="2026-02-17 14:36:44 +0000 UTC" firstStartedPulling="2026-02-17 14:36:46.398061542 +0000 UTC m=+4280.509480879" lastFinishedPulling="2026-02-17 14:36:52.84961927 +0000 UTC m=+4286.961038627" observedRunningTime="2026-02-17 14:36:53.482325669 +0000 UTC m=+4287.593745016" watchObservedRunningTime="2026-02-17 14:36:53.490806715 +0000 UTC m=+4287.602226062" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.326939 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.327341 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.402971 4804 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.835023 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:36:55 crc kubenswrapper[4804]: I0217 14:36:55.835095 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.388380 4804 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.456917 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:37:05 crc kubenswrapper[4804]: I0217 14:37:05.574615 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cjvt" podUID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerName="registry-server" containerID="cri-o://c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd" gracePeriod=2 Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.587099 4804 generic.go:334] "Generic (PLEG): container finished" podID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" containerID="c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd" exitCode=0 Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.598339 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd"} Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.692415 4804 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.790717 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.790957 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.791002 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") pod \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\" (UID: \"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0\") " Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.791424 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities" (OuterVolumeSpecName: "utilities") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.798654 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5" (OuterVolumeSpecName: "kube-api-access-7pbc5") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). 
InnerVolumeSpecName "kube-api-access-7pbc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.870114 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" (UID: "24bb8e4f-1bc0-4422-877c-3b9f26a4ded0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893214 4804 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893484 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbc5\" (UniqueName: \"kubernetes.io/projected/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-kube-api-access-7pbc5\") on node \"crc\" DevicePath \"\"" Feb 17 14:37:06 crc kubenswrapper[4804]: I0217 14:37:06.893557 4804 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.598999 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjvt" event={"ID":"24bb8e4f-1bc0-4422-877c-3b9f26a4ded0","Type":"ContainerDied","Data":"bfe6afe0591c71ea93e7728ff8ca0d1783e36d1a1e8760a729d643c649ba52b5"} Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.599057 4804 scope.go:117] "RemoveContainer" containerID="c26f8a66dc3f1a3e4f6b2b927be5f18e96692a3cd83154e44f53aae9783d2efd" Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.599253 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjvt" Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.622446 4804 scope.go:117] "RemoveContainer" containerID="0bcf1df09a991fc2dc58734e56b601ca1908332ce3dee158c9bd719c118aac29" Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.665439 4804 scope.go:117] "RemoveContainer" containerID="502702fc4c5f477b627accb815f1a4f523409bd0ef1fb1794ca8a6b6389d9199" Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.674158 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:37:07 crc kubenswrapper[4804]: I0217 14:37:07.684253 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cjvt"] Feb 17 14:37:08 crc kubenswrapper[4804]: I0217 14:37:08.585021 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bb8e4f-1bc0-4422-877c-3b9f26a4ded0" path="/var/lib/kubelet/pods/24bb8e4f-1bc0-4422-877c-3b9f26a4ded0/volumes" Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.790481 4804 generic.go:334] "Generic (PLEG): container finished" podID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1" exitCode=0 Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.790573 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mdsd/must-gather-64wjc" event={"ID":"de9029fd-fb98-4bf0-a6fc-0baf663a4e92","Type":"ContainerDied","Data":"ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1"} Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.792174 4804 scope.go:117] "RemoveContainer" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1" Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.834975 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:37:25 crc kubenswrapper[4804]: I0217 14:37:25.835066 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:37:26 crc kubenswrapper[4804]: I0217 14:37:26.472809 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/gather/0.log" Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.758602 4804 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"] Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.759309 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6mdsd/must-gather-64wjc" podUID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerName="copy" containerID="cri-o://8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67" gracePeriod=2 Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.776275 4804 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mdsd/must-gather-64wjc"] Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.916709 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log" Feb 17 14:37:37 crc kubenswrapper[4804]: I0217 14:37:37.917815 4804 generic.go:334] "Generic (PLEG): container finished" podID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" containerID="8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67" exitCode=143 Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 
14:37:38.176658 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.177667 4804 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.295099 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") pod \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.295260 4804 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") pod \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\" (UID: \"de9029fd-fb98-4bf0-a6fc-0baf663a4e92\") " Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.303531 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r" (OuterVolumeSpecName: "kube-api-access-xth5r") pod "de9029fd-fb98-4bf0-a6fc-0baf663a4e92" (UID: "de9029fd-fb98-4bf0-a6fc-0baf663a4e92"). InnerVolumeSpecName "kube-api-access-xth5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.403049 4804 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xth5r\" (UniqueName: \"kubernetes.io/projected/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-kube-api-access-xth5r\") on node \"crc\" DevicePath \"\"" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.447515 4804 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "de9029fd-fb98-4bf0-a6fc-0baf663a4e92" (UID: "de9029fd-fb98-4bf0-a6fc-0baf663a4e92"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.504803 4804 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de9029fd-fb98-4bf0-a6fc-0baf663a4e92-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.586618 4804 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9029fd-fb98-4bf0-a6fc-0baf663a4e92" path="/var/lib/kubelet/pods/de9029fd-fb98-4bf0-a6fc-0baf663a4e92/volumes" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.933012 4804 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mdsd_must-gather-64wjc_de9029fd-fb98-4bf0-a6fc-0baf663a4e92/copy/0.log" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.934706 4804 scope.go:117] "RemoveContainer" containerID="8a5d5495b17851f93d14861ab3120bb7a96ba669e31d998c9788362daafedc67" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.934905 4804 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mdsd/must-gather-64wjc" Feb 17 14:37:38 crc kubenswrapper[4804]: I0217 14:37:38.959647 4804 scope.go:117] "RemoveContainer" containerID="ba8a99b1d53310cd598e93015ecdc8bff1c0871f8d9af2216aa4262da6b1fde1" Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.836155 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.836973 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.837046 4804 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.838457 4804 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"} pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:37:55 crc kubenswrapper[4804]: I0217 14:37:55.838579 4804 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" 
containerID="cri-o://9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5" gracePeriod=600 Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100273 4804 generic.go:334] "Generic (PLEG): container finished" podID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerID="9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5" exitCode=0 Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100386 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerDied","Data":"9de6f8932aa6eb9745d883c27729ffc7b5e517e2da23231504aaed3b733a9ff5"} Feb 17 14:37:56 crc kubenswrapper[4804]: I0217 14:37:56.100530 4804 scope.go:117] "RemoveContainer" containerID="f5cab721b63cbd419aa7e5bbd4e4a6bd46150ce41fd2b5e1f7c13b37e9cab7db" Feb 17 14:37:57 crc kubenswrapper[4804]: I0217 14:37:57.114267 4804 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" event={"ID":"6992e22f-b963-46fc-ac41-4ca9938dda85","Type":"ContainerStarted","Data":"34f2b4f4028dda4d42ec0945680072c7bcde40409024f0a26b6628c15a828e33"} Feb 17 14:40:25 crc kubenswrapper[4804]: I0217 14:40:25.835186 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:40:25 crc kubenswrapper[4804]: I0217 14:40:25.835819 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:40:55 crc kubenswrapper[4804]: 
I0217 14:40:55.835504 4804 patch_prober.go:28] interesting pod/machine-config-daemon-zb7c5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:40:55 crc kubenswrapper[4804]: I0217 14:40:55.837135 4804 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7c5" podUID="6992e22f-b963-46fc-ac41-4ca9938dda85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"